Article

The Relationship between Learning Outcomes and Graduate Competences: The Chain-Mediating Roles of Project-Based Learning and Assessment Strategies

1 School of International Education, Jiangsu Maritime Institute, Nanjing 211170, China
2 National Higher Education Research Institute, Universiti Sains Malaysia, Gelugor 11800, Penang, Malaysia
* Author to whom correspondence should be addressed.
Sustainability 2024, 16(14), 6080; https://doi.org/10.3390/su16146080
Submission received: 2 June 2024 / Revised: 6 July 2024 / Accepted: 10 July 2024 / Published: 16 July 2024

Abstract

Addressing the skill gap between labor market requirements and graduate readiness is crucial for the sustainable development of China’s vocational education system. This study investigated how outcome-based education (OBE) influences the attainment of graduate competence in China’s higher vocational education system, using the theory of Constructive Alignment (CA) as its foundation. The OBE model incorporates intended learning outcomes, project-based learning, and assessment strategies to ensure graduate competence aligns with professional sustainability practices. This study assessed the impact of intended learning outcomes, project-based learning, and assessment strategies on graduate competence attainment through surveys administered to 320 Cross-border E-commerce learners in April 2024, resulting in 301 usable responses. Data were analyzed using structural equation modeling (SEM) with SPSS 23.0 and AMOS 24.0. The results indicated that project-based learning and assessment strategies were directly and positively related to graduate competency attainment, while intended learning outcomes were indirectly associated with graduate competence through the mediating roles of project-based learning and assessment strategies. Assessment strategies had the most significant mediating effect, followed by project-based learning and the combined mediating effect. These findings advance the theoretical understanding of OBE and provide methods for promoting sustainability in vocational education. Beneficiaries include educators, policymakers, and accreditation bodies, who can use these insights to implement sustainable educational practices and ensure graduates contribute to sustainable development.

1. Introduction

Exploring new approaches in higher vocational education is driven by a significant skill gap between labor market requirements and the educational readiness of graduates, which presents a major challenge for the sustainable development of China’s vocational education system [1,2]. Recent studies have highlighted that graduates often lack the competencies demanded by employers, a problem exacerbated by the rapid advancements in information technology and globalization [3,4]. Digital transformation and automation have fundamentally altered job requirements, making it imperative for educational content and curricula to evolve accordingly [5]. These changes necessitate not only a shift in educational approaches but also a fundamental restructuring of the educational content to align with new job market demands.
In response to these changes, a growing number of educators advocate for the reverse design and customization of students’ curricula to align with their future career requirements [1]. Outcome-based education (OBE) theory has attracted the attention of academic researchers with regard to producing graduates better prepared for the workforce, thereby fostering the sustainability of higher vocational education. OBE has been recognized globally for its innovative approach to learning, creating an environment where students are motivated by practical, real-world applications of their education [6]. However, it is crucial to note that the demand for OBE varies significantly across different degree programs, with higher demand often seen in programs closely linked to industry needs, such as STEM (science, technology, engineering, and mathematics) education [7].
The OBE approach is characterized by its curriculum design, teaching and learning activities (TLAs), and assessment strategies (ASs), all driven by the exit learning outcomes that students are expected to have achieved upon completing a program [8,9]. The primary objective of OBE is to provide students with a focused and coherent academic program that enhances student engagement and develops the skills and competencies required by the market. OBE teaching is therefore inductive, an approach many academic researchers argue is more effective in motivating students to learn and in enhancing the attainment of the competencies required by the labor market. Currently, the OBE approach is highly regarded and widely accepted as a strategy for reforming and renewing education policy worldwide [6].
Despite strong advocacy and comprehensive implementation in global higher education, OBE has faced criticism from educators and academics. Challenges have arisen due to a lack of understanding among educators about the operational principles of OBE, particularly the misalignment between course learning outcomes (CLOs) and program learning outcomes [10]. Large-scale changes in long-practiced teaching and learning methods are difficult to implement [11]. Moreover, assessments often prioritize course content and grades over the intended learning outcomes (ILOs) [12]. These gaps in OBE implementation highlight the need for a systematic and comprehensive evaluation to identify the causes of the challenges OBE faces and to explore opportunities for enhancement [13]. It is also necessary to consider the influence of students’ personal backgrounds and intellectual aptitudes on their success in achieving learning outcomes, as highlighted by Mark Payne’s work on educational outcomes among Slovak Roma children [14], which draws on Bronfenbrenner’s “Process-Person-Context-Time” model [15].
Constructive alignment (CA) is a crucial concept for effective OBE implementation, ensuring that TLAs and ASs are aligned with ILOs to enhance competency achievement [16,17]. This alignment helps students understand what is expected of them and how to achieve different levels of learning outcomes, fostering active involvement and engagement [18]. In defining ILOs for this study, we employed Bloom’s Taxonomy to ensure clarity and constructive alignment. Bloom’s Taxonomy categorizes educational goals into six hierarchical levels: remembering, understanding, applying, analyzing, evaluating, and creating. This framework helps in crafting learning outcomes that are specific, measurable, and aligned with the competencies expected of graduates [18]. For example, action verbs such as “state”, “explain”, and “solve” call for very different competencies at various levels. By clearly specifying these verbs from the outset, educators help students understand the expectations and guide them in achieving the learning outcomes.
Project-based learning (PjBL), as a systematic teaching method, enables students to achieve different levels of competencies by linking projects with the professions in which such projects would be worked on [19]. It is a student-centered process that caters to students’ needs and encourages active involvement in their learning [20,21]. PjBL aligns with the principles of OBE, which emphasizes student-centeredness, competency development, and real-world relevance. As Tang [22] noted, implementing OBE with PjBL enhances student learning engagement. In addition, Alonzo et al. and Lui and Shum assert that the careful design of assessment strategies is crucial for implementing coherent and effective learning, thereby ensuring the learning process is more structured and goal-oriented [23,24].
Therefore, relying on the theory of CA, this study integrated PjBL as a TLA into OBE implementation and investigated how learning outcomes contribute to the attainment of graduate competence (AGC), as well as the mediating roles of PjBL and ASs in the relationships between ILOs and AGC. We specifically focused on students enrolled in Cross-border E-commerce (CBEC) programs at higher vocational colleges in Nanjing, Jiangsu province, China. The CBEC program was selected due to the rapid growth of e-commerce globally and its significant impact on international trade, especially during the COVID-19 outbreak, underscoring the urgency of equipping students with the necessary competencies. By investigating CBEC learners’ attitudes toward OBE implementation in achieving graduate competence, our study aims to bridge the skill gap by aligning educational outcomes with the specific demands of the e-commerce sector. Additionally, this approach aims to address the critical knowledge gap in OBE implementation by emphasizing the interconnectedness of outcomes, the teaching–learning process, and assessment strategies, providing a comprehensive framework for enhancing sustainable educational practices. Specifically, this study intends to address the following research questions:
Q1. What is the relationship between intended learning outcomes and the attainment of graduate competence among the CBEC learners?
Q2. What is the relationship between project-based learning and the attainment of graduate competence among the CBEC learners?
Q3. What is the relationship between assessment strategies and the attainment of graduate competence among the CBEC learners?
Q4. How does OBE implementation affect the attainment of graduate competence among the CBEC learners?
The subsequent sections of this study are structured as follows: Section 2 provides a comprehensive literature review and outlines the hypothesis development; Section 3 introduces the research methodology employed in this study; Section 4 details the results derived from the data analysis; Section 5 engages in a discussion of the findings and explores the theoretical, practical, and managerial implications arising from the findings of this study; and Section 6 ends with the conclusions and limitations of this study.

2. Literature Review and Hypothesis Development

2.1. Outcome-Based Education: Background and Practice

Outcome-based education (OBE) can be traced back to the early 1990s, when William G. Spady proposed it as a reformative measure to enhance the quality of the American school system [25]. Since then, OBE has been widely promoted and extended to higher education, leading to profound changes in quality assurance mechanisms [26].
At its core, OBE prioritizes attaining specific, desired learning outcomes. Spady [25] defined OBE as “clearly focusing and organizing everything in an educational system around what is essential for all students to be able to do successfully at the end of their learning experiences”. This definition is supported by King and Evans, who describe outcomes as the end products of the instructional process [27]. Similarly, Syeed et al. advocate for OBE as a way of designing, developing, and documenting instruction centered on goals and outcomes [28]. In the OBE model, the teaching and learning process is oriented toward the ends, purposes, accomplishments, and results that students can achieve by the end of a program. Therefore, OBE is an educational process involving students in their learning journey to achieve well-defined outcomes.
Although the concept of OBE is straightforward, its implementation presents various challenges. Firstly, ILOs should be precisely articulated and communicated to all stakeholders, including students, educators, entrepreneurs, and academic institutions. Secondly, course content, educational strategies, teaching methods, and assessment procedures must be aligned with the learning outcomes. Lastly, the learning environment should be tailored to facilitate student engagement and support the achievement of the learning outcomes, with all modifications adequately documented. All in all, OBE requires a comprehensive strategy for teaching, measuring, and tracking well-defined outcomes that are crucial for students’ professional development.
Therefore, OBE represents a significant departure from traditional education, where teachers determine instructional content. Instead, OBE should focus on what students should be able to know/perform at the end of a learning experience. According to Ortega and Cruz [9], this paradigm shift challenged educators to engage and act as learning facilitators rather than mere conveyors of knowledge. Transitioning from conventional education to OBE necessitates focusing on desired outcomes, ensuring that instructors and learners align their attitudes with the educational objectives [29]. Therefore, a fundamental aspect of OBE is the shift towards a student-centered approach to teaching and learning processes. OBE prioritizes students’ needs, aims, and learning outcomes over traditional teacher-oriented methods. It emphasizes active learning, the direct or indirect assessment of student progress, and the practical application of knowledge. This approach fosters an environment where students take ownership of their education and are better prepared for real-world challenges. Figure 1 comprehensively explains the OBE mandates and adheres to the guidelines proposed by the Washington Accord.

2.2. Hypothesis Development

2.2.1. The Relationship between Intended Learning Outcomes and Project-Based Learning

Studies have demonstrated the direct effect of ILOs on teaching and learning activities [29]. Libba et al. found that mission statements, learning objectives, and course outcomes can vary significantly across institutions, even within the same subject area [30]. This variation leads to differences in teaching materials and methodologies, with some programs excelling in specific teaching domains, while others focus on generic competencies. Precise and reasonable objectives are essential for effectively guiding all teaching activities, ensuring that the design of these activities closely aligns with the ILOs [31].
OBE emphasizes the importance of providing clear ILOs to help students achieve desired outcomes. Knowledge is effectively transferred to students through well-defined outcomes formulated for specific programs [32]. Therefore, teachers must first consider the pre-determined outcomes of a program when designing and preparing classroom activities, guiding students toward these goals [29]. This approach aligns with the concept of CA, proposed by Biggs and Tang [18], where ILOs guide the development and implementation of teaching approaches that facilitate knowledge acquisition and skill development. In constructive alignment, the key idea is to align teaching and assessment methods with these intended learning outcomes to ensure students learn effectively and are motivated to engage in the learning process.
Currently, PjBL is considered one of the most effective teaching and learning processes for enhancing the achievement of ILOs. This hands-on approach not only aligns with OBE guidelines but also strengthens students’ learning engagement. In OBE, ILOs are strategically crafted to promote more effective learning at all levels [33].
Given these insights, we propose the following hypothesis.
H1: 
Intended learning outcomes have a significant positive and direct effect on project-based learning.

2.2.2. The Relationship between Intended Learning Outcomes and Assessment Strategies

The design and selection of ASs are deeply rooted in the relevant ILOs [34]. The effectiveness of assessment methods depends on how well they measure these outcomes; Alonzo et al. [23] emphasize the importance of aligning assessment tasks with learning outcomes to accurately evaluate student achievement. CA further highlights the importance of developing ASs that can measure student learning relative to desired outcomes. Therefore, ILOs are crucial in shaping curriculum design, TLAs, and the overall assessment approach [35].
Given this evidence, the following hypothesis is proposed.
H2: 
Intended learning outcomes have a positive and direct effect on assessment strategies.

2.2.3. The Relationship between Intended Learning Outcomes and Graduate Competence

ILOs refer to the levels of understanding and performance that students are expected to achieve at the end of a learning experience [18,34,36]. They provide verifiable statements of what learners are expected to know, understand, and/or be able to perform [25]. Previous research has shown that different requirements of ILOs are positively related to achieving graduate employability [37,38]. Clear teaching objectives and predefined expectations can inspire students to become creative and innovative thinkers, fostering the successful implementation of OBE. Additionally, Soares et al. highlight that clearly defined learning outcomes can enhance students’ learning experiences and subsequently improve their employability prospects [39]. Therefore, Hypothesis 3 is proposed below.
H3: 
Intended learning outcomes positively and directly affect the attainment of graduate competence.

2.2.4. The Relationship between Project-Based Learning and Graduate Competence

Previous studies have demonstrated that PjBL directly affects the development of graduate competence [40,41]. PjBL provides students with opportunities to engage in activities that foster competence development, such as sustainability assessments or stakeholder engagement, within a safe learning environment [42,43]. Empirical research has shown that PjBL enhances students’ IT skills [44], academic performance [45], and perception of the learning profession [46]. By leveraging their existing knowledge, skills, and attitudes, PjBL promotes student competence and the development of a sense of accomplishment [47].
Furthermore, PjBL integrates learning objectives into long-term projects that are related to real-world business practices and designed to solve actual business tasks and problems [48]. Through PjBL, different levels and types of graduate competence are incorporated into projects where students either work with actual clients or initiate and implement concepts they have created. Therefore, PjBL enables students to develop various professional competencies aligned with the ILOs required in the competence framework. This indicates that PjBL plays a significant role in mediating the relationship between the ILOs and AGC.
The design of ASs is informed by TLAs [36,49]. Jones et al. [50] highlight the diverse dimensions of learning, indicating that a single assessment method may not fully capture the assessment needs of a curriculum, its objectives, and outcomes. A comprehensive evaluation encompasses various types of assessments, such as formative and summative, direct and indirect, course-focused and longitudinal, and authentic and course-embedded assessments [51]. The PjBL environment is particularly well suited to these types of assessments. Additionally, according to Guerrero-Roldán et al., selecting teaching and learning activities is essential for achieving and assessing competencies [52].
Based on the above analysis, the following hypotheses are proposed.
H4: 
Project-based learning has a positive and direct effect on the attainment of graduate competence.
H4a: 
Project-based learning plays a mediating role in the relationship between the formulation of intended learning outcomes and the attainment of graduate competence.
H5: 
Project-based learning has a positive and direct effect on assessment strategies.

2.2.5. The Relationship between Assessment Strategies and Graduate Competence

The design of ASs influences AGC [53]. ASs that are demanding, reliable, and engaging can foster valuable student learning [54,55]. They are widely acknowledged as essential instruments for improving the readiness of undergraduate students for future employment [56]. These findings support the notion that aligning assessments with ILOs can enhance AGC.
Furthermore, authentic assessment can positively impact students by elevating their aspirations and boosting their motivation by showcasing the connection between curricular activities and their desired graduate outcomes [57]. Similarly, teachers can use assessments to engage students in learning and teaching activities [58], as any approach that helps students learn is critical for increasing students’ learning gains [59]. Alonzo et al. [23] have emphasized the importance of effectively aligning assessment tasks with learning outcomes to evaluate students’ achievement of said outcomes. Cifrian et al. [60] suggest that aligning assessments with learning objectives increases the number of opportunities students have to learn and practice the knowledge and skills required for successful assessment performance.
Furthermore, Biggs (2014) has proposed that constructive alignment involves defining the learning outcomes that students are expected to achieve before the teaching process begins [16]. Subsequently, teaching and assessment methods are designed to effectively facilitate attaining those outcomes and assess the level of achievement. In line with this, assessment gathers evidence that validates and documents meaningful learning in the classroom [61]. The related literature confirms the desirability of using diverse assessment methods within active learning [62], which leads to better grades and helps students achieve educational learning goals and improve their performance [63]. Studies have shown that aligning learning with learning outcomes and competencies can make assessments more transparent for students while also assisting in quality assurance and course design [52]. Therefore, it is believed that PjBL has a direct impact on the design of ASs, and PjBL and ASs play a chain-mediating role in the relationship between the formulation of ILOs and AGC.
Hence, the following hypotheses are proposed.
H6: 
Assessment strategies have a positive and direct effect on the attainment of graduate competence.
H6a: 
Assessment strategies play a mediating role in the relationship between intended learning outcomes and the attainment of graduate competence.
H6b: 
Project-based learning and assessment strategies have a chain-mediating effect in the relationship between intended learning outcomes and the attainment of graduate competence.
Informed by the theory of CA and the discussions above, a hypothesized framework was developed to examine the relationship between OBE implementation and AGC and the mediating roles of PjBL and ASs in the relationships between ILOs and AGC. This framework is depicted in Figure 2.
The proposed theoretical framework integrates ILOs, PjBL, and ASs as independent variables to achieve graduate competencies (the dependent variable) upon completion of an educational program. This framework examines how OBE is implemented within the analyzed program and how various aspects of this implementation interrelate. More importantly, it explores the mediating roles of PjBL and ASs in the relationship between ILOs and AGC. This approach aims to provide a comprehensive understanding of the pathways through which learning goals contribute to the development of competencies, mediated by PjBL and ASs, which are essential for graduates’ success in the workforce.

3. Research Methodology

3.1. Research Design

This study employed a quantitative approach using a cross-sectional research design to investigate how OBE implementation contributes to the attainment of graduate competence and evaluate the mediating roles of project-based learning and assessment strategies in the relationship between learning outcomes and graduate competence. According to Mohajan [64], a cross-sectional design involves conducting a survey at a specific point in time or over a short period, measuring the outcomes and exposures of the subjects simultaneously [65]. This approach is particularly useful for assessing OBE implementation in educational programs, as it allows researchers to evaluate the current state of OBE implementation and its impact on achieving expected graduate competencies. Furthermore, the use of questionnaires is common in cross-sectional studies [66], as they are considered to constitute one of the simplest and most reliable methods for collecting standardized data from a large sample [67]. Therefore, we conducted a cross-sectional online questionnaire survey, adapted from previous studies, to gather data on learners’ attitudes towards the effects of different aspects of OBE implementation in relation to attaining various levels of graduate competencies in academic programs.

3.2. Instrument Design and Development

The questionnaire was designed based on previous instruments whose validity and reliability had already been validated by the original researchers. The questionnaire had two primary sections. The initial section comprised demographic inquiries, encompassing gender, affiliation, major, and the duration of exposure to OBE. The subsequent section, constituting the core of the questionnaire, featured four latent variables along with 24 measurement items.
The selection of previous questionnaires was guided by two key factors: their proven validity and reliability, and their adherence to OBE guidelines, which made them suitable for this investigation. Additionally, since the targeted respondents primarily consisted of students, the chosen instruments were deemed relevant to this study. Authorization to adapt the questionnaire was obtained from the original researchers, ensuring ethical compliance.
Respondents were tasked with rating the measurement items on a 7-point Likert scale, ranging from 1 (strongly disagree) to 7 (strongly agree). For ILOs, seven items were adapted from Custodio et al. [68]; for PjBL, six items were sourced from Grossman et al. [69]; for ASs, seven items were drawn from Custodio et al. [68] and Baguio [70]; and for AGC, four items were adapted from Custodio et al. [68].
According to Bhattacherjee [71], content validity assesses whether the adapted measurement items align well with the latent variables in the questionnaire. The validation process typically begins by inviting experts from the relevant field. Three OBE experts and a deputy director were invited to validate the relevance and accuracy of the survey items related to OBE implementation in the questionnaire. Their tasks involved examining the instrument content to identify any irrelevant items, ambiguity, repetition, and language nuances.
To ensure content validity, two expert panels were assembled to scrutinize the adapted questionnaire items, identifying any potential issues such as irrelevance, ambiguity, or unclear wording. Following their recommendations, two additional demographic questions were incorporated into the initial section: respondents’ age and the duration of their involvement in PjBL. Subsequently, language translators were employed to assess the clarity and appropriateness of the items in the Chinese version, resulting in minor adjustments for linguistic accuracy.
As suggested by the language experts, a brief introduction to the OBE guidelines and the key concepts of ILOs, PjBL, and ASs was provided at the beginning of the questionnaire to enhance respondents’ understanding of OBE and of its effect on attaining graduate competence. Next, to assess the face validity of the adapted instrument [72], six students from the target demographic were asked to participate in a voluntary pretest of the initial questionnaire. Their feedback revealed no ambiguities or issues requiring further clarification. Additionally, a QR code was created using WeChat to allow the respondents to ask questions or provide feedback on survey items they found confusing during the survey. Further details regarding the measurement items can be found in Appendix A.

3.3. Sampling and Data Collection

The population for this study was purposively selected to be learners majoring in CBEC at higher vocational colleges in Nanjing, the capital of Jiangsu province, located in eastern China. Specifically, the selected colleges include Jiangsu Maritime Institute, Jiangsu Vocational Institute of Commerce, Nanjing Vocational Institute of Railway Technology, and Nanjing Vocational College of Information Technology. These colleges were chosen as research sites because they had implemented OBE for more than two years during the research period. Additionally, Nanjing’s reputation for both commercial activity and educational excellence further enhances its suitability as a setting for studying international trade and electronic business. These colleges offer a range of business programs, such as international trade, electronic business, marketing, and e-commerce, aimed at developing business expertise in line with evolving market demands. Therefore, learners enrolled in these business programs at the selected colleges could perceive the effects of OBE implementation, making them suitable respondents for investigating the relationships between learning goals and graduate competence as mediated by PjBL and ASs upon completing their program of study.
The sample size was determined according to the criterion for SEM, which recommends a minimum sample size of ten times the total number of observed variables [73]. With 30 observed variables in this survey, 350 respondents were randomly selected from the higher vocational colleges through a lottery system to participate in the survey. To facilitate data collection, the questionnaire’s link or QR code, generated automatically using the Sojump app, was shared with the class groups via WeChat by their institutional practitioners.
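As a quick sanity check, the sampling arithmetic above can be expressed in a few lines (a minimal sketch using only the figures reported in this section):

```python
# Rule-of-thumb minimum sample size for SEM: at least 10 respondents
# per observed variable (figures taken from this section).
observed_variables = 30
minimum_n = 10 * observed_variables          # -> 300

invited = 350                                # respondents selected by lottery
valid_responses = 301                        # usable questionnaires returned

retrieval_rate = valid_responses / invited   # -> 0.86
print(f"minimum n = {minimum_n}, retrieval rate = {retrieval_rate:.0%}")

# The valid sample should still clear the minimum after attrition.
assert valid_responses >= minimum_n
```

Even with 49 non-usable responses, the 301 valid cases remain above the 300-respondent floor implied by the 10:1 rule.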
Before commencing the online survey, the researchers obtained permission from the Human Research Ethics Committee of Universiti Sains Malaysia (USM). Additionally, consent was obtained from the deputy deans of the involved schools at the four higher vocational colleges. Afterward, the survey was explained to the respondents to ensure their understanding of this study. Finally, participants were informed of the study’s topic, purpose, and significance and guided to anonymously and voluntarily complete the online questionnaire.
Ultimately, 301 valid responses, a retrieval rate of 86%, were obtained and adopted for data analysis. Table 1 shows that the gender distribution of the respondents was relatively balanced, with males accounting for 55.1% and females for 44.9%. Respondents aged 20–22 formed the largest group in the sample (82.4%), followed by the age groups of 23–25 and 26 and above, accounting for 16.6% and 1.0%, respectively. The distribution of respondents among the institutions was relatively even, with the four institutions representing approximately 22.3%, 26.2%, 27.2%, and 24.3%, respectively. Regarding their enrolled programs, most participants specialized in cross-border e-commerce (50.5%), while smaller percentages specialized in e-commerce, international trade, and international business, comprising 8.6%, 13.6%, and 13.3%, respectively. Among these respondents, 27.2% had two years of experience in OBE learning, and 72.8% had three years. Furthermore, respondents were asked to specify how long they had been engaged in PjBL: 77.7% had two years of engagement, while 22.3% had three years; no respondents reported having only one year of project-based engagement. These respondents were therefore eligible for the survey and possess a certain level of representativeness.

3.4. Statistical Analysis

Data analysis was conducted using SPSS 23.0 and AMOS 24.0. First, a test of normality was conducted to ensure the data distribution met the assumptions required for further analysis. Next, the Harman single-factor test was implemented to assess common method bias. Structural equation modeling (SEM) was then performed to investigate both the measurement and structural models. Specifically, confirmatory factor analysis (CFA) was conducted to evaluate reliability and validity, yielding factor loadings, composite reliability (CR), and average variance extracted (AVE). Goodness-of-fit (GoF) indices and path coefficients were examined to assess the acceptability of the structural model. Lastly, the bootstrapping method was employed to evaluate the statistical significance of the direct and mediating effects within the proposed hypotheses.

4. Results

4.1. Test of Normality

Normality can be assessed using various methods, including the Kolmogorov–Smirnov (KS) test, the Shapiro–Wilk (SW) test, skewness and kurtosis values, and graphical techniques such as histograms, box plots, or P–P plots [74]. In this study, numerical methods were employed to assess normality because of their objective nature [75], which is especially useful when experience is limited [76]. Accordingly, normality was tested using the skewness and kurtosis values of the 24 measurement items, with absolute skewness values required to be less than 3 and absolute kurtosis values less than 7, in line with the basic requirements for univariate normality [77]. The results of the normality test are shown in Table 2. All absolute skewness and kurtosis values fall within the acceptable range, indicating that the data meet the normal distribution requirements.
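The skewness and kurtosis screening described above was performed in SPSS; its logic can be sketched in a few lines of Python. Note that `scipy.stats.kurtosis` returns excess kurtosis (0 for a normal distribution), the convention most SEM software reports. The thresholds below are the ones cited in the text; the simulated item is illustrative, not the study’s data:

```python
import numpy as np
from scipy.stats import skew, kurtosis

def univariate_normality_ok(values, skew_limit=3.0, kurt_limit=7.0):
    """Apply the |skewness| < 3 and |kurtosis| < 7 rules of thumb to one item."""
    return abs(skew(values)) < skew_limit and abs(kurtosis(values)) < kurt_limit

# Illustrative use on simulated Likert-like responses (hypothetical data):
rng = np.random.default_rng(42)
item = np.clip(np.round(rng.normal(4.5, 1.2, size=301)), 1, 7)
print(univariate_normality_ok(item))
```

In practice this check would be applied item by item across all 24 measurement items.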

4.2. Common Method Bias (CMB)

All data were collected from a single source, namely, CBEC graduates enrolled in public higher vocational colleges, through self-reported questionnaires. This uniform data collection method may elicit consistent response patterns, potentially introducing common method bias (CMB) and affecting the study’s validity and reliability [78]. To address this issue, the Harman single-factor test was conducted using SPSS 23.0 to check for CMB [79]. Using the exploratory factor analysis (EFA) approach to the Harman single-factor test [80], the results indicated that four factors with eigenvalues exceeding 1 accounted for 79.800% of the total explained variance. Furthermore, the first factor explained 46.444% of the variance, below the critical threshold of 50% [81]. This suggests that CMB was not substantial in the collected data and was unlikely to affect the observed relationships among variables.
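The Harman single-factor test above was run in SPSS; a common way to reproduce its key quantity, the share of variance on the first unrotated factor, is through the eigenvalues of the item correlation matrix. The sketch below makes that assumption and is illustrative rather than a re-implementation of the authors’ procedure:

```python
import numpy as np

def harman_first_factor_share(responses):
    """Proportion of total variance explained by the first unrotated factor.

    `responses` is an (n_respondents, n_items) array. Since the trace of a
    correlation matrix equals the number of items, the largest eigenvalue
    divided by the eigenvalue sum gives the first factor's variance share.
    CMB is typically flagged when this share exceeds 0.50.
    """
    corr = np.corrcoef(np.asarray(responses, dtype=float), rowvar=False)
    eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]  # descending order
    return eigvals[0] / eigvals.sum()
```

With largely independent items the first factor carries a small share; when one latent source drives all responses, the share approaches 1 and CMB would be flagged.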

4.3. Measurement Model Verification

Anderson and Gerbing [82] advocate using confirmatory factor analysis (CFA) to validate a measurement model. Cronbach’s alpha and McDonald’s omega measure the internal consistency of the items in a measurement tool [83], with coefficients between 0.80 and 1.00 indicating high reliability [84]. Additionally, factor loadings, composite reliability (CR), and average variance extracted (AVE) are commonly employed to assess convergent validity [85,86,87]. Hair et al. [88] suggest that standardized factor loadings above 0.5 can be considered significant, while values below 0.45 may warrant item deletion [89]. Viladrich et al. [90] suggest a minimum threshold of 0.70 for composite reliability. Furthermore, Hair et al. [88] recommend using AVE as a measure of convergent validity in structural equation modeling (SEM), with values of 0.5 or higher considered acceptable. The reliability and validity values of the four latent variables are presented in Table 3.
As illustrated in Table 3, the Cronbach’s alpha and McDonald’s omega values for the four latent variables ranged between 0.914 and 0.967. All values fall between 0.80 and 1.00, indicating excellent reliability across the measured constructs. Furthermore, all standardized factor loadings were above 0.5 and significant at p < 0.001, with t-values ranging from 15.442 to 22.017, surpassing the critical value of 1.96; thus, the factor loadings of the individual items in the four-factor model were all significant. Additionally, CR values ranged from 0.915 to 0.967 and AVE values from 0.710 to 0.808, above the suggested cut-offs of 0.70 and 0.50, respectively. These results provide strong evidence of convergent validity.
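The CR and AVE figures in Table 3 follow the standard formulas based on standardized loadings: CR = (Σλ)² / ((Σλ)² + Σ(1 − λ²)) and AVE = Σλ² / n. A minimal sketch with hypothetical loadings (not the study’s actual values):

```python
def composite_reliability(loadings):
    """CR = (sum(l))^2 / ((sum(l))^2 + sum(1 - l^2)) for standardized loadings."""
    total = sum(loadings)
    error = sum(1.0 - l * l for l in loadings)
    return total * total / (total * total + error)

def average_variance_extracted(loadings):
    """AVE = mean of the squared standardized loadings."""
    return sum(l * l for l in loadings) / len(loadings)

# Hypothetical standardized loadings for a seven-item construct:
loadings = [0.85, 0.88, 0.90, 0.84, 0.87, 0.89, 0.86]
```

Loadings of this magnitude yield CR above the 0.70 cut-off and AVE above 0.50, the same pattern reported in Table 3.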
Furthermore, discriminant validity was assessed to determine the extent to which each construct in the model differed from the others [88]. This was achieved by comparing the square root of the AVE for each construct with the correlations between latent variables. According to Fornell and Larcker [91], when the square root value of the AVE is higher than the Pearson correlation coefficient value with other constructs, this indicates discriminant validity between the constructs. Table 4 presents the discriminant validity test results. The values below the diagonal represent the Pearson correlation coefficients between constructs, all of which are smaller than the square root of the AVE values on the diagonal. Conclusively, all research constructs have met the requirements for discriminant validity.
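The Fornell–Larcker comparison in Table 4 reduces to one rule: each construct’s √AVE must exceed its absolute correlations with every other construct. A minimal check, with illustrative (hypothetical) values in the test rather than the study’s matrix:

```python
import numpy as np

def fornell_larcker_ok(ave, corr):
    """True if sqrt(AVE) of each construct exceeds its absolute correlations
    with all other constructs (Fornell-Larcker criterion)."""
    root = np.sqrt(np.asarray(ave, dtype=float))   # diagonal values in Table 4
    corr = np.asarray(corr, dtype=float)           # latent-variable correlations
    p = len(root)
    return all(root[i] > abs(corr[i, j])
               for i in range(p) for j in range(p) if i != j)
```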

4.4. Structural Model Analysis

This study evaluated the structural model using goodness-of-fit (GoF) indices and path coefficients. A GoF test assesses how well the empirical data fit the hypothesized model, and many researchers have explored the use of model fit indices to evaluate overall model quality [92,93,94]. These indices include the Tucker–Lewis Index (TLI), the Comparative Fit Index (CFI), the Incremental Fit Index (IFI), the Goodness-of-Fit Index (GFI), and the Adjusted Goodness-of-Fit Index (AGFI), with values above 0.9 indicating good model fit. A Root Mean Square Error of Approximation (RMSEA) and a Standardized Root Mean Square Residual (SRMR) below 0.08 also signify a good fit [95]. Furthermore, the chi-square (χ2) value, degrees of freedom (df), and the ratio χ2/df were used to assess overall model fit.
However, the chi-square value is sensitive to sample size and may not always produce precise results [96]. As Bollen et al. [97] noted, when the sample size in SEM exceeds 200, the chi-square value is often excessively large, suggesting a poor model fit. To mitigate this, the GoF values can be adjusted using the Bollen–Stine bootstrap correction, which provides an alternative way of obtaining more accurate results [97]. For the ratio of chi-square to degrees of freedom, values below 3.0 are considered ideal [77]. More importantly, the other fit indices provide a more rigorous standard for verifying structural model fit.
The recommended threshold values of these indices, along with the adjusted results obtained using bootstrapping techniques, are shown in Table 5. As illustrated in Table 5, the results indicate a good fit for the structural model, with χ2/df = 1.79, IFI = 0.97, CFI = 0.97, TLI = 0.97, GFI = 0.94, AGFI = 0.93, SRMR = 0.03, and RMSEA = 0.05. All model fit indices met the suggested thresholds, confirming that the proposed structural model was acceptable and indicating strong alignment between the collected data and the model.
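The cut-offs applied in Table 5 can be collected into a single check; the `reported` dictionary below holds the fit statistics quoted in the text, and the thresholds are the ones cited above:

```python
def fit_indices_acceptable(fit):
    """Compare SEM fit indices against the cut-offs cited in the text:
    chi2/df < 3, SRMR and RMSEA < 0.08, incremental/absolute indices > 0.90."""
    checks = {"chi2/df": fit["chi2/df"] < 3.0,
              "SRMR": fit["SRMR"] < 0.08,
              "RMSEA": fit["RMSEA"] < 0.08}
    for name in ("TLI", "CFI", "IFI", "GFI", "AGFI"):
        checks[name] = fit[name] > 0.90
    return checks

# Fit statistics reported for the structural model (Table 5):
reported = {"chi2/df": 1.79, "IFI": 0.97, "CFI": 0.97, "TLI": 0.97,
            "GFI": 0.94, "AGFI": 0.93, "SRMR": 0.03, "RMSEA": 0.05}
```

Every reported index passes its threshold, consistent with the conclusion of an acceptable model.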
In addition to assessing the GoF indices, SEM can be used to examine the significance and strength of the path relationships among variables. Figure 3 presents the explanatory variance and path coefficients with standardized parameter estimates.
In the hypothesized model, ILOs accounted for 22% of the variance in PjBL, with a standardized regression coefficient of 0.468, indicating a significant positive correlation between the two constructs. The combined influence of ILOs and PjBL explained 25% of the variance in ASs, with coefficients of 0.290 and 0.295, respectively, suggesting a positive association with ASs. Moreover, ILOs, PjBL, and ASs collectively explained 50% of the variance in AGC, with standardized regression coefficients of 0.026, 0.224, and 0.566, respectively, indicating satisfactory predictive strength of the hypothesized model. However, the standardized path coefficient of 0.026 for the direct path from ILOs to AGC falls below the recommended threshold value (0.20), suggesting a minimal direct contribution [98]. This weak direct relationship underscores the roles of PjBL and ASs in mediating the indirect effect of ILOs on AGC within the hypothesized framework.

4.5. Testing of the Hypotheses

Applying SEM in AMOS with maximum likelihood estimation provided robust results for evaluating the hypothesized relationships in this study. Bootstrapping methods, particularly bias-corrected bootstrapping, can generate confidence intervals (CIs) with adequate statistical power for direct effects [99,100]. Byrne [101] further noted that bootstrapping techniques can mitigate the impact of non-normal kurtosis on CFA. As detailed in Table 6, the empirical results supported five of the six hypotheses. ILOs had a positive and significant influence on PjBL (β = 0.468, p < 0.001), with a 95% percentile CI [0.230, 0.560] that does not include 0, affirming Hypothesis 1 (H1). Similarly, PjBL and ASs were positively correlated (β = 0.295, p < 0.001), with a 95% percentile CI [0.064, 0.585], supporting Hypothesis 2 (H2). ILOs also had a significant positive relationship with ASs (β = 0.29, p < 0.001), with a 95% percentile CI [0.077, 0.523], confirming Hypothesis 3 (H3). PjBL was significantly linked to AGC (β = 0.224, p < 0.001), with a 95% percentile CI [0.064, 0.512], validating Hypothesis 4 (H4), and ASs were strongly associated with graduate competence (β = 0.566, p < 0.001), with a 95% percentile CI [0.311, 0.813], confirming Hypothesis 5 (H5). However, the direct effect of ILOs on AGC was not significant (β = 0.026, p > 0.05), with a 95% percentile CI [−0.082, 0.152] that includes 0, leading to the rejection of Hypothesis 6 (H6). This suggests that any influence of ILOs on AGC is indirect, potentially mediated by PjBL and ASs.

4.6. Mediating Effects of Project-Based Learning and Assessment Strategies

To analyze the mediating effects within the hypothesized model, a bootstrap method with 1000 resamples at the 95% confidence level, as suggested by [102], was employed to verify the mediating roles of PjBL and ASs. A mediating effect is considered significant when the Z value exceeds 1.96 and the 95% bias-corrected CI does not include 0 [103].
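The study ran its mediation bootstrap in AMOS on the full latent-variable model; the logic of a percentile bootstrap for a simple indirect effect a×b (X → M → Y) can nevertheless be sketched with ordinary least squares. This is a simplified illustration with hypothetical variables, not the authors’ procedure:

```python
import numpy as np

def bootstrap_indirect_effect(x, m, y, n_boot=1000, seed=0):
    """Percentile bootstrap 95% CI for the indirect effect a*b in X -> M -> Y,
    where a is the X -> M slope and b is the M -> Y slope controlling for X."""
    rng = np.random.default_rng(seed)
    n = len(x)
    estimates = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)          # resample cases with replacement
        xb, mb, yb = x[idx], m[idx], y[idx]
        a = np.polyfit(xb, mb, 1)[0]         # X -> M slope
        design = np.column_stack([np.ones(n), xb, mb])
        b = np.linalg.lstsq(design, yb, rcond=None)[0][2]  # M -> Y slope given X
        estimates.append(a * b)
    lo, hi = np.percentile(estimates, [2.5, 97.5])
    return lo, hi
```

If the resulting interval excludes 0, the indirect effect is judged significant, mirroring the decision rule described above.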
Table 7 presents the direct, indirect, and total effects of the hypothesized model. At the 95% confidence level, the confidence intervals of both the bias-corrected and percentile methods were statistically significant (Z = 3.359, 95% bias-corrected CI [0.156, 0.511], 95% percentile CI [0.155, 0.510]), indicating a significant total effect. Similarly, the total indirect effect was significant (Z = 4.235, 95% bias-corrected CI [0.171, 0.442], 95% percentile CI [0.174, 0.448]). However, the direct effect of ILOs on AGC was not significant, with both confidence intervals including 0 (Z = 0.375), indicating that PjBL and ASs fully mediate the relationship between ILOs and AGC.
To specifically examine the mediating effects of PjBL and ASs, three alternative structural models were developed and tested.
Model 1: Project-based learning as a mediator.
This model assessed the indirect effects of ILOs on AGC through PjBL. The fit indices demonstrated a satisfactory fit (χ2/df = 2.400, IFI = 0.970, CFI = 0.970, TLI = 0.965, GFI = 0.908, AGFI = 0.879, SRMR = 0.031, RMSEA = 0.068), with confidence interval values for indirect effects not including 0, supporting hypothesis 5a.
Model 2: Assessment strategies as a mediator.
This model assessed the indirect effects of ILOs on AGC through ASs, also exhibiting satisfactory fit indices (χ2/df = 2.619, IFI = 0.962, CFI = 0.961, TLI = 0.955, GFI = 0.893, AGFI = 0.861, SRMR = 0.036, RMSEA = 0.073). The confidence intervals for indirect effects did not include 0, indicating that the indirect effect of ILOs on AGC through ASs was significant, thereby verifying hypothesis 6a.
Model 3: Project-based learning and assessment strategies as chain mediators.
This full model examined the sequential mediating effect of PjBL and ASs in the relationship between ILOs and AGC, yielding satisfactory fit indices. The results confirmed a positive and significant sequential mediating effect (Z = 2.031, 95% bias-corrected CI [0.02, 0.157], 95% percentile CI [0.019, 0.148]), supporting hypothesis 6b.
Finally, it may be of interest to explore the differences in mediating effects among the three models. We examined the percentage of the total indirect effect accounted for by each mediator. The mediating effect of ASs (ILO-AS-AGC) was the greatest, accounting for 44.2% of the total indirect effect. This was followed by PjBL (ILO-PjBL-AGC), which accounted for 30.2%, and the sequential mediation of PjBL and ASs (ILO-PjBL-AS-AGC), accounting for 22.6%. These results highlight the critical mediating roles of PjBL and ASs in the relationship between ILOs and AGC, demonstrating their importance in enhancing graduate competence through structured educational strategies.

5. Discussion

This section presents comprehensive quantitative findings addressing each research question. The main innovation of this study is the development of a theoretical model that explored the relationship between learning goals and graduate competence mediated by project-based learning and assessment strategies. It also assessed how OBE implementation, in terms of ILOs, PjBL, and ASs, individually and sequentially, contributed to AGC. Implementing OBE with a focus on alignment and active learning methodologies like PjBL fosters a sustainable learning environment that prepares students for the evolving demands of the job market.
(1)
In terms of Q1, the results revealed that ILOs had no statistically significant direct effect on AGC. This finding suggests that merely identifying specific ILOs does not guarantee graduate competence attainment. This aligns with previous studies on OBE implementation [34,104,105], which suggest that mechanical processes of pursuing outcomes without deliberate revision of pedagogy and assessment fail to achieve the desired competence. While a significant direct effect may be lacking, clear learning outcome statements are crucial for graduate competence attainment. When learning outcome statements are explicit about proficiencies students are to achieve, they provide reference points for students’ performance and ensure the alignment of the curriculum, instruction, and assessment. This is supported by [17], who emphasized that the essence of OBE lies in clearly specifying learning outcomes, aligning them across educational levels, and strategically planning teaching methods and assessments.
(2)
Regarding Q2, this study’s results highlight the positive and direct effect of PjBL on AGC among CBEC learners. This finding aligns with previous research by [103], which emphasizes that competencies can be effectively developed through active learning methodologies like PjBL. Unlike traditional educational models, PjBL offers a holistic, student-centered learning process. It fosters high levels of motivation, interest, and active engagement in learning, leading to enhanced academic performance and competencies relevant to the labor market. PjBL helps students better understand learning outcomes and fosters a conducive learning environment, promoting the development of higher-order competencies for their future careers and lives. Additionally, this study extends previous research by investigating the mediating role of PjBL in the relationship between ILOs and AGC. The findings confirmed that PjBL mediates the pathway from ILOs to AGC [106,107]. ILOs associated with PjBL allow students to clearly identify and state the aims and objectives of projects, stimulating students’ motivation, engagement, and self-regulation through planning, organization, and monitoring. Therefore, PjBL mediates the relationship between ILOs and AGC. Specifically, PjBL serves as a mediator, creating a more effective learning environment aligned with ILOs, thus contributing to the development of graduate competence.
(3)
In response to Q3, the results revealed that well-designed ASs had a significant positive and direct effect on AGC, as supported by previous studies [23,108]. Clear descriptions and assessment criteria help students understand what competencies they will attain, how to perform an activity, and how they will be assessed, boosting students’ comfort and confidence. Additionally, ASs play a significant mediating role in the relationship between ILOs and AGC, as highlighted by [52]. Learning outcomes guide the design of ASs, enabling teachers to enhance students’ learning gains [59]. Outcome-based assessment processes should be tailored to the specific outcomes being assessed [109], allowing formative assessment to guide the development of graduate attributes [110]. The findings also support the research conducted by Yusof et al. [111], which emphasized the importance of using multiple assessment instruments that provide students with opportunities to perform adequately, thereby helping them develop competencies and achieve the desired learning outcomes. Additionally, the findings confirmed that ILOs affected AGC through the chain-mediating effect of PjBL and ASs, as noted by Zhang and Ma [112], who found that ILOs define the expected outcomes students must achieve, while TLAs enhance student engagement and facilitate the achievement of these outcomes. This approach ensures continuous improvement in educational practices, which is essential for sustainability in education.
(4)
The final research question investigated the combined effect of OBE implementation (specifically ILOs, PjBL, and ASs) on AGC among CBEC learners. The analysis of the hypothesized model revealed that the standardized path coefficient for the direct pathway from ILOs to AGC was 0.026, below the recommended threshold of 0.2, indicating a minimal direct effect of ILOs on AGC. However, ILOs had an indirect effect on AGC through PjBL and ASs, suggesting that these two constructs completely mediated the relationship. The investigation of three alternative models showed that, among the different aspects of OBE implementation, ASs had the largest mediating effect on AGC, followed by PjBL, which accounted for 30.2% of the total indirect effect, and the sequential mediation of PjBL and ASs, which accounted for 22.6%. Pairwise contrasts of the indirect effects demonstrated that while PjBL and ASs are correlated, their individual mediating effects are larger than their sequential (chain) effect in the model, indicating that each mediator contributes more to AGC on its own than through the sequential path. This is consistent with Preacher and Hayes [113], who showed that two indirect effects can differ significantly and that a single variable can mediate more strongly than chain or sequential mediators in a hypothesized model.
Our findings have several implications for theory, practice, and management, as follows:
(1)
From a theoretical perspective, this study contributes to the existing literature on OBE implementation by optimizing OBE to promote AGC using the theory of CA [18] as its foundations. Proper educational alignment occurs when TLAs help students develop the knowledge targeted, which is accurately measured through assessment tasks [114]. Our study employed PjBL as a key TLA, which had a significant positive effect on AGC among CBEC learners. This student-centered approach fosters high levels of motivation, interest, and engagement, leading to the development of competencies that are highly relevant to the labor market. Well-defined ASs provide clear descriptions and criteria, helping students understand the competencies they will attain and boosting their confidence and performance. Therefore, the findings enrich the literature on OBE by showing that PjBL and ASs aligned with ILOs directly and positively impact AGC. Additionally, they mediate the relationship between ILOs and AGC, enriching our understanding of OBE practices.
This study also utilized Bloom’s Taxonomy to craft clear and constructively aligned learning outcomes, clarifying what learners are expected to achieve by the end of a study program, the anticipated standard of achievement, and how they should demonstrate their learning. The findings confirmed the pivotal role of Bloom’s Taxonomy in writing ILOs and aligning them with program and course outcomes, thus promoting the attainment of advanced competencies pertinent to the job market. Therefore, this study enhances the existing body of literature on OBE implementation by illustrating how Bloom’s Taxonomy serves as a foundational element in writing ILO statements and aligning them with PjBL and ASs.
(2)
This study offers several practical implications for higher vocational practitioners planning to implement OBE in higher vocational colleges. Proper training and tailored education for practitioners are essential to ensure a better understanding of OBE guidelines and constructively align ILOs with PjBL, assessments, and performance measurements, which are crucial for enhancing students’ learning experiences and improving graduates’ competencies [115]. Additionally, vocational educators should incorporate PjBL into their teaching strategies to create a positive environment for active learning. This involves engaging students in challenging tasks, promoting sustained inquiry, facilitating the discovery of answers to authentic questions, assisting in project selection, reflecting on the process, reviewing and revising work, and developing a final product for a specific audience. These practices increase the number of opportunities students have to attain ILOs relevant to their professions. This approach contributes significantly to promoting sustainability within higher vocational education. Furthermore, this research reveals that within the OBE framework, ASs have the most significant influence in terms of mediating the relationship between ILOs and AGC, even more than that of PjBL. This underscores the importance of having a diverse array of ASs that effectively measure learning outcomes, aligning with Alonzo’s work [23], which emphasizes the growing emphasis on ASs in contemporary educational programs.
(3)
From a managerial perspective, this study presents a comprehensive OBE framework for policymakers and accreditation bodies that aligns with national and international standards (e.g., the Washington Accord and NBA) to ensure academic equivalency and accreditation globally. The standardized and achievable expectations and benchmarks set by these bodies will ensure the effectiveness of the OBE process and minimize resistance from educational institutions during implementation [49]. Additionally, due to the dynamic nature of the global business landscape, e-commerce companies must continuously update their competency standards to ensure graduates possess the necessary skills to thrive in the international business environment. These updates are crucial for the continuous quality improvement of academic programs. Gurukkal [116] emphasized the potential of OBE in the global context, highlighting the need for higher-education institutions to adapt to the outcome-based education system due to the ongoing economic revolution and the demand for standardized indicators and international accreditation measures.

6. Conclusions

This study highlights the critical need for implementing OBE in higher vocational education to better align graduates’ skills with employment requirements. Focusing on CBEC learners, we applied the theory of CA to construct and empirically validate a framework examining the impact of ILOs, PjBL, and ASs on AGC, as well as the mediating roles of PjBL and ASs in the relationship between learning outcomes and graduate competence attainment.
While ILOs did not show a direct impact on AGC, they serve as essential foundations for guiding both teaching methods and assessments. Explicit ILOs provide a clear roadmap for educators and students, ensuring that educational activities are purpose-driven and aligned with desired competencies. This alignment is crucial for sustainable education as it promotes consistency and clarity in regard to educational objectives.
This study underscores the mediating roles of PjBL and ASs in the relationship between ILOs and AGC. This highlights the necessity of a holistic approach in OBE implementation, where teaching activities and assessments are seamlessly aligned with learning outcomes to optimize student competencies. Additionally, this study identifies a significant chain-mediating effect of PjBL and ASs in the relationship between ILOs and AGC. This underscores the importance of integrating multiple educational strategies to achieve comprehensive learning outcomes.
Effective implementation of the OBE approach requires constructive alignment, ensuring that measurable ILOs align with industry needs. This alignment should be reflected in all aspects of the educational process, ranging from teaching methods to assessments. By aligning educational practices with the demands of the job market, our study promotes sustainable educational practices that contribute to the long-term viability of both educational institutions and the industries they serve.
The limitations of this study should be acknowledged, and directions for future research suggested. Firstly, the survey was limited to candidates from public vocational colleges in Nanjing, Jiangsu Province, China, potentially limiting the generalizability of the findings. Future research should encompass a broader range of institutions across China to provide a more comprehensive evaluation of OBE’s impact. Secondly, this study focused on a relatively small sample of students, excluding key stakeholders such as instructors, program designers, program evaluators, and administrators, which may have restricted the depth and breadth of the analysis. Future research should expand the participant pool to include diverse student groups from various disciplines as well as educators’ perspectives, offering a more comprehensive understanding of the impact of OBE implementation on educational outcomes. Thirdly, this study primarily explored the OBE process in terms of ILOs, TLAs, ASs, and AGC, overlooking other factors such as curriculum design, motivation, and institutional support. Future research should incorporate these variables to offer more robust recommendations for effective OBE implementation and competency enhancement in higher vocational education. Fourthly, assessing the attainment of set goals through uniform assessment rubrics and criteria often leads to formalism. Future studies should explore more flexible and adaptive assessment methods that better capture the complexities of achieving different levels of graduate competence. Finally, this study employed a cross-sectional design to measure the attainment of graduate competence rather than assessing continuous development through tangible products such as portfolios, exams, or other culminating artifacts.
Recognizing this limitation, it is recommended that future research adopt a longitudinal approach to track graduates’ actual workplace outcomes over time, thereby providing a more objective measure of AGC.

Author Contributions

M.L. conceived and designed the research, developed the methodology, and wrote the manuscript; M.I.R. conducted data analysis, proofreading, and editing. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Philosophy and Social Science Research in Colleges and Universities in Jiangsu Province (Project No. 2020SJB1298).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The datasets generated during the current study are available from the corresponding author on reasonable request.

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A. Research Questionnaire

Dear Students,
To better understand your attitudes and perceptions regarding the implementation of Outcome-Based Education (OBE) in the Cross-border E-commerce program, we have provided a brief introduction to the guidelines and practices of OBE.
What is OBE?
OBE refers to the constructive alignment of teaching and learning activities (TLAs) and assessment strategies (ASs) with intended learning outcomes (ILOs) to facilitate the attainment of graduate competence (AGC).
Key Concepts:
Graduate Competence (AGC): A set of skills that makes graduates likely to gain employment and be successful in their chosen careers.
Intended Learning Outcomes (ILOs): The level of learning outcomes that students are expected to achieve by the end of a learning experience. They are specific indicators of different levels of graduate competence.
Teaching and Learning Activities (TLAs): Activities designed to engage students and help them achieve ILOs. In this program, project-based learning (PjBL) is used as a key TLA.
Assessment Strategies (ASs): Methods used to assess the extent to which students have attained the graduate competencies, often involving rubrics and specific criteria.
Purpose of the Questionnaire:
This online questionnaire was developed based on the OBE guidelines and practices to understand your attitudes and perceptions regarding the effects of OBE implementation, particularly in terms of ILOs, PjBL, and ASs, on the attainment of graduate competence.
We value your input and ask for your cooperation in filling out this questionnaire, which will take approximately 10–15 min. Your responses are exclusively for academic research and will remain strictly confidential. Please select the number that best matches the degree to which you agree or disagree with each statement.
1 = Strongly Disagree (StrDA)
2 = Disagree (DA)
3 = Slightly Disagree (SigDA)
4 = Neither Agree nor Disagree (NAD)
5 = Slightly Agree (SigA)
6 = Agree (A)
7 = Strongly Agree (StrA).
  • Section 1: Demographic Information
1.
Gender
① Male ② Female
2.
Age
① 19 ② 20 ③ 21 ④ 22 ⑤ 23 ⑥ 24 ⑦ 25 or above
3.
Institute: ______________________________
4.
Major: _______________________________
5.
Duration of outcome-based education learning experience
① One year ② Two years ③ Three years
6.
Duration of Engagement in Project-Based Learning
① One year ② Two years ③ Three years
  • Section 2:
Item Code | Statement | 1 = Strongly Disagree … 7 = Strongly Agree
Intended learning outcomes (ILO)
ILO 1 | The intended learning outcomes are introduced to students upon admission into the program. | 1 2 3 4 5 6 7
ILO 2 | The intended learning outcomes are clear and understandable. ¹ | 1 2 3 4 5 6 7
ILO 3 | The intended learning outcomes are relevant to future professions. | 1 2 3 4 5 6 7
ILO 4 | The intended learning outcomes are attainable. | 1 2 3 4 5 6 7
ILO 5 | The intended learning outcomes are carefully developed based on the competences (knowledge, skills, and attitude) your college requires graduates of the program to possess. | 1 2 3 4 5 6 7
ILO 6 | The intended learning outcomes are carefully developed based on what the industry expects from graduates of the program prior to their entry into the labor force. | 1 2 3 4 5 6 7
ILO 7 | The intended learning outcomes are carefully developed based on what students and parents expect from the program. | 1 2 3 4 5 6 7
Project-based learning (PjBL)
PjBL 1 | Teaching and learning activities in project-based learning are in line with the intended learning outcomes. | 1 2 3 4 5 6 7
PjBL 2 | Project-based learning motivates me to understand the learning outcomes I am meant to achieve. | 1 2 3 4 5 6 7
PjBL 3 | Project-based learning increases my motivation for the subject. | 1 2 3 4 5 6 7
PjBL 4 | Project-based learning helps in developing the learning process to improve learning performance. | 1 2 3 4 5 6 7
PjBL 5 | Opportunities for practical application of professional skills in project-based learning are adequate. | 1 2 3 4 5 6 7
PjBL 6 | Teaching and learning activities in project-based learning are organized appropriately for students to achieve the graduate competences. | 1 2 3 4 5 6 7
Assessment strategies (AS)
AS 1 | The design of assessment tasks is closely linked to the intended learning outcomes. | 1 2 3 4 5 6 7
AS 2 | Assessment tasks align with the teaching and learning activities in project-based learning. | 1 2 3 4 5 6 7
AS 3 | The design of assessment tasks offers me the opportunity to improve my performance (e.g., knowledge, skills, and attitude). | 1 2 3 4 5 6 7
AS 4 | The teachers use different assessment tools to evaluate students’ progress. | 1 2 3 4 5 6 7
AS 5 | I can assess how well I have acquired the expected knowledge, skills, and attitudes after the teaching and learning process. | 1 2 3 4 5 6 7
AS 6 | Assessment criteria and rubrics for assessing learning outcomes are explained to students at the beginning of the teaching practice. | 1 2 3 4 5 6 7
AS 7 | Criteria and rubrics for assessing the attainment of graduate competences are appropriate. | 1 2 3 4 5 6 7
Attainment of graduate competences (AGC)
AGC 1 | The attainment of graduate competences can contribute to employability. | 1 2 3 4 5 6 7
AGC 2 | The attainment of graduate competences can prepare me better for the workplace. | 1 2 3 4 5 6 7
AGC 3 | The OBE approach is the best solution to address skill mismatches in the workplace. | 1 2 3 4 5 6 7
AGC 4 | I think OBE will lead to greater efficiency and quality in attaining graduate competences. | 1 2 3 4 5 6 7
¹ Educators crafted the intended learning outcomes using the verbs recommended in Bloom’s taxonomy.

Figure 1. OBE framework defining the process followed in an educational program.
Figure 2. A theoretical model of this study (Note: the unlabeled points H5a, H6a, and H6b represent the mediating effect and chain mediating effect of PjBL and ASs).
Figure 3. Path diagram and standardized estimates.
Table 1. Demographic characteristics of the respondents.
Demographic Profile | Classification | Frequency (N = 301) | Valid Percentage (%)
Gender | Male | 166 | 55.1
 | Female | 135 | 44.9
Age | 20–22 | 248 | 82.4
 | 23–25 | 50 | 16.6
 | 26 and above | 3 | 1.0
Institutions | College A | 67 | 22.3
 | College B | 79 | 26.2
 | College C | 82 | 27.2
 | College D | 73 | 24.3
Fields of study | Electronic commerce | 26 | 8.6
 | International trade | 41 | 13.6
 | International business | 40 | 13.3
 | Cross-border electronic commerce | 152 | 50.5
 | Marketing | 42 | 14.0
Duration of outcome-based education learning experience | 1 year | 0 | 0.0
 | 2 years | 82 | 27.2
 | 3 years | 219 | 72.8
Duration of engagement in PjBL ¹ | 1 year | 0 | 0.0
 | 2 years | 234 | 77.7
 | 3 years | 67 | 22.3
¹ Project-based learning.
Table 2. Results of the normality test.
Construct | Items | Mean | Skewness | Kurtosis
Intended learning outcomes | ILO1 | 5.59 | −1.596 | 1.486
 | ILO2 | 5.59 | −1.542 | 1.328
 | ILO3 | 5.63 | −1.569 | 1.393
 | ILO4 | 5.58 | −1.576 | 1.456
 | ILO5 | 5.58 | −1.537 | 1.290
 | ILO6 | 5.59 | −1.540 | 1.285
 | ILO7 | 5.63 | −1.591 | 1.560
Project-based learning | PjBL1 | 5.95 | −1.986 | 3.864
 | PjBL2 | 6.04 | −2.069 | 4.232
 | PjBL3 | 5.92 | −1.957 | 3.730
 | PjBL4 | 6.03 | −2.086 | 4.711
 | PjBL5 | 6.01 | −2.005 | 4.345
 | PjBL6 | 5.96 | −1.995 | 3.688
Assessment strategies | AS1 | 6.02 | −2.064 | 4.060
 | AS2 | 5.92 | −1.931 | 3.145
 | AS3 | 6.00 | −2.070 | 4.018
 | AS4 | 6.02 | −2.080 | 3.990
 | AS5 | 5.99 | −2.120 | 4.309
 | AS6 | 5.99 | −2.111 | 4.029
 | AS7 | 6.01 | −2.139 | 4.453
Attainment of graduate competences | AGC1 | 5.98 | −2.134 | 4.278
 | AGC2 | 6.04 | −2.249 | 5.107
 | AGC3 | 5.96 | −2.014 | 3.910
 | AGC4 | 5.96 | −2.067 | 3.726
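As a rough illustration of the screening above (not the authors' SPSS procedure), skewness and excess kurtosis can be estimated with simple moment-based formulas and compared against Kline's |skewness| < 3 and |kurtosis| < 10 rules of thumb; the sample values below are hypothetical.

```python
def moments(xs):
    """Moment-based skewness and excess kurtosis (biased estimators;
    SPSS reports bias-corrected versions, so its values differ slightly)."""
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n
    m3 = sum((x - mean) ** 3 for x in xs) / n
    m4 = sum((x - mean) ** 4 for x in xs) / n
    return m3 / m2 ** 1.5, m4 / m2 ** 2 - 3

def roughly_normal(xs):
    # Kline's cutoffs used in the study: |skewness| < 3 and |kurtosis| < 10
    skew, kurt = moments(xs)
    return abs(skew) < 3 and abs(kurt) < 10

# Hypothetical 7-point responses skewed toward "agree", like the items above
sample = [7, 7, 6, 6, 6, 5, 5, 7, 6, 4, 6, 7, 5, 6, 6]
print(moments(sample), roughly_normal(sample))
```

All items in Table 2 fall inside these bounds, which is why the data were treated as approximately normal despite the strong negative skew.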
Table 3. Reliability and validity results.
Latent Variable | Item | SC | SE | T-Value | p-Value | Cronbach’s α | McDonald’s ω | CR | AVE
Intended learning outcomes (ILOs) | ILO1 | 0.821 | / | / | / | 0.967 | 0.967 | 0.967 | 0.808
 | ILO2 | 0.909 | 0.054 | 20.444 | ***
 | ILO3 | 0.956 | 0.052 | 22.370 | ***
 | ILO4 | 0.899 | 0.055 | 20.052 | ***
 | ILO5 | 0.936 | 0.054 | 21.500 | ***
 | ILO6 | 0.898 | 0.055 | 20.009 | ***
 | ILO7 | 0.865 | 0.055 | 18.826 | ***
Project-based learning (PjBL) | PjBL1 | 0.824 | / | / | / | 0.947 | 0.948 | 0.948 | 0.754
 | PjBL2 | 0.898 | 0.053 | 19.953 | ***
 | PjBL3 | 0.817 | 0.059 | 17.122 | ***
 | PjBL4 | 0.911 | 0.051 | 20.459 | ***
 | PjBL5 | 0.943 | 0.049 | 21.700 | ***
 | PjBL6 | 0.806 | 0.060 | 16.797 | ***
Assessment strategies (ASs) | AS1 | 0.894 | / | / | / | 0.944 | 0.945 | 0.945 | 0.710
 | AS2 | 0.785 | 0.053 | 17.864 | ***
 | AS3 | 0.839 | 0.047 | 20.325 | ***
 | AS4 | 0.840 | 0.047 | 20.334 | ***
 | AS5 | 0.836 | 0.047 | 20.151 | ***
 | AS6 | 0.806 | 0.050 | 18.782 | ***
 | AS7 | 0.892 | 0.043 | 23.190 | ***
Attainment of graduate competences (AGC) | AGC1 | 0.827 | / | / | / | 0.914 | 0.915 | 0.915 | 0.730
 | AGC2 | 0.866 | 0.056 | 17.990 | ***
 | AGC3 | 0.891 | 0.057 | 18.711 | ***
 | AGC4 | 0.831 | 0.062 | 16.943 | ***
Notes: SC: standardized coefficients; SE: standard error; ***: p < 0.001.
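The composite reliability (CR) and average variance extracted (AVE) reported above follow directly from the standardized loadings via Fornell and Larcker's formulas. As an illustrative check (not the authors' software output), the sketch below recomputes both for the ILO construct from the loadings in Table 3.

```python
def composite_reliability(loadings):
    """CR = (Σλ)² / ((Σλ)² + Σ(1 − λ²)) for standardized loadings λ."""
    shared = sum(loadings) ** 2
    error = sum(1 - l ** 2 for l in loadings)  # item error variances
    return shared / (shared + error)

def average_variance_extracted(loadings):
    """AVE = Σλ² / k, the mean squared standardized loading."""
    return sum(l ** 2 for l in loadings) / len(loadings)

ilo = [0.821, 0.909, 0.956, 0.899, 0.936, 0.898, 0.865]  # SCs from Table 3
print(round(composite_reliability(ilo), 3))       # matches the reported 0.967
print(round(average_variance_extracted(ilo), 3))  # matches the reported 0.808
```

Both values clear the conventional thresholds (CR > 0.7, AVE > 0.5), consistent with the table.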
Table 4. The discriminant validity test of latent variables.
Latent Variable | AGC | ASs | PjBL | ILOs
AGC | 0.854
ASs | 0.674 | 0.843
PjBL | 0.480 | 0.431 | 0.868
ILOs | 0.373 | 0.428 | 0.468 | 0.899
Notes: The bold diagonal values are the square roots of the AVE, while the off-diagonal values are the Pearson correlations between constructs.
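The Fornell–Larcker criterion behind Table 4 can be sketched as follows: discriminant validity holds when each construct's √AVE (the diagonal) exceeds every correlation involving that construct. The values are copied from Tables 3 and 4.

```python
from math import sqrt

ave = {"AGC": 0.730, "ASs": 0.710, "PjBL": 0.754, "ILOs": 0.808}  # Table 3
# Inter-construct Pearson correlations (off-diagonal entries of Table 4)
corr = {("AGC", "ASs"): 0.674, ("AGC", "PjBL"): 0.480, ("AGC", "ILOs"): 0.373,
        ("ASs", "PjBL"): 0.431, ("ASs", "ILOs"): 0.428, ("PjBL", "ILOs"): 0.468}

def fornell_larcker_ok(ave, corr):
    """True when sqrt(AVE) of each construct exceeds all of its correlations."""
    for (a, b), r in corr.items():
        if r >= sqrt(ave[a]) or r >= sqrt(ave[b]):
            return False
    return True

print(fornell_larcker_ok(ave, corr))
```

The largest correlation (0.674, AGC–ASs) sits below the smallest diagonal (0.843), so the criterion is satisfied for all construct pairs.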
Table 5. Goodness-of-fit indices of the structural model.
Fit Indices | Criteria | Model Fit of the Research Model
χ² | The smaller, the better | 440.19
df | The larger, the better | 246.00
Normed chi-square (χ²/df) | 1 < χ²/df < 3 | 1.79
RMSEA | <0.08 | 0.05
SRMR | <0.08 | 0.03
TLI | >0.9 | 0.97
CFI | >0.9 | 0.97
GFI | >0.9 | 0.94
AGFI | >0.9 | 0.93
IFI | >0.9 | 0.97
Notes: χ²: chi-square; df: degrees of freedom; RMSEA: root mean square error of approximation; SRMR: standardized root mean square residual; TLI: Tucker–Lewis index; CFI: comparative fit index; GFI: goodness-of-fit index; AGFI: adjusted goodness-of-fit index; IFI: incremental fit index.
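A minimal sketch of how the cutoffs in Table 5 can be checked programmatically; the index values are copied from the table, and the thresholds follow the table's criteria column rather than any single universal standard.

```python
fit = {"chi2": 440.19, "df": 246.00, "RMSEA": 0.05, "SRMR": 0.03,
       "TLI": 0.97, "CFI": 0.97, "GFI": 0.94, "AGFI": 0.93, "IFI": 0.97}

normed = fit["chi2"] / fit["df"]  # normed chi-square, reported as 1.79
acceptable = (1 < normed < 3
              and fit["RMSEA"] < 0.08 and fit["SRMR"] < 0.08
              and all(fit[k] > 0.9 for k in ("TLI", "CFI", "GFI", "AGFI", "IFI")))
print(round(normed, 2), acceptable)
```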
Table 6. The test results regarding the path relationships.
Hypothesis | Path | Unstd. Estimate | S.E. | T-Value | Sig. | Std. Estimate | Bootstrapping 95% CI (Lower, Upper) | Hypothesis Test
H1 | ILOs→PjBL | 0.374 | 0.048 | 7.878 | *** | 0.468 | (0.230, 0.560) | Supported
H2 | PjBL→ASs | 0.326 | 0.069 | 4.742 | *** | 0.295 | (0.064, 0.585) | Supported
H3 | ILOs→ASs | 0.257 | 0.055 | 4.701 | *** | 0.290 | (0.077, 0.513) | Supported
H4 | PjBL→AGC | 0.232 | 0.059 | 3.963 | *** | 0.224 | (0.064, 0.512) | Supported
H5 | ASs→AGC | 0.530 | 0.057 | 9.383 | *** | 0.566 | (0.311, 0.813) | Supported
H6 | ILOs→AGC | 0.021 | 0.045 | 0.473 | 0.636 | 0.026 | (−0.082, 0.152) | Not supported
Note: Unstd.: unstandardized estimate; S.E.: standard error; Sig.: significance; Std.: standardized estimate; ***: p < 0.001.
Table 7. Direct, indirect, and total effects of the hypothesized model.
Effect | Path Relationship | Point Estimate | SE | Z-Value | Bias-Corrected 95% CI (Lower, Upper) | Percentile 95% CI (Lower, Upper)
Test of indirect, direct, and total effects
DistalIE | ILO→PjBL→AS→AGC | 0.065 | 0.032 | 2.031 | (0.020, 0.157) | (0.019, 0.148)
PjBLIE | ILO→PjBL→AGC | 0.087 | 0.044 | 1.977 | (0.026, 0.209) | (0.023, 0.194)
ASIE | ILO→AS→AGC | 0.136 | 0.067 | 2.030 | (0.041, 0.318) | (0.027, 0.296)
TIE | Total indirect effect | 0.288 | 0.068 | 4.235 | (0.171, 0.442) | (0.174, 0.448)
DE | ILO→AGC | 0.021 | 0.056 | 0.375 | (−0.082, 0.152) | (−0.085, 0.142)
TE | Total effect | 0.309 | 0.092 | 3.359 | (0.156, 0.511) | (0.155, 0.510)
Comparison of indirect effects
PjBLDIEdiff | PjBL vs. DistalIE | 0.022 | 0.056 | 0.393 | (−0.097, 0.126) | (−0.091, 0.130)
ASDIEdiff | AS vs. DistalIE | 0.072 | 0.081 | 0.889 | (−0.050, 0.285) | (−0.076, 0.247)
PjBLASdiff | PjBL vs. AS | −0.050 | 0.091 | −0.549 | (−0.247, 0.108) | (−0.235, 0.118)
Percentage of indirect effects
P1 | DistalIE/TIE | 0.225 | 0.110 | 2.045 | (0.062, 0.500) | (0.060, 0.496)
P2 | PjBLIE/TIE | 0.302 | 0.139 | 2.173 | (0.080, 0.602) | (0.076, 0.595)
P3 | ASIE/TIE | 0.473 | 0.166 | 2.849 | (0.146, 0.785) | (0.120, 0.758)
P4 | TIE/TE | 0.931 | 0.227 | 4.101 | (0.659, 1.429) | (0.665, 1.444)
P5 | DE/TE | 0.069 | 0.227 | 0.304 | (−0.429, 0.341) | (−0.444, 0.335)
Note: DistalIE: distal (chain) indirect effect; TIE: total indirect effect; DE: direct effect; TE: total effect.
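The percentile-bootstrap logic behind Table 7 can be illustrated in miniature. The sketch below resamples synthetic data (not the study's dataset) and bootstraps a simple two-path indirect effect a × b; the actual analysis used AMOS with a three-path chain and bias-corrected intervals, and adjusts the b path for the predictor, which this simplified version omits.

```python
import random

def slope(xs, ys):
    """OLS slope of ys on xs (covariance over variance)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((a - mx) * (b - my) for a, b in zip(xs, ys))
            / sum((a - mx) ** 2 for a in xs))

def indirect(xs, ms, ys):
    # a: X→M path; b: M→Y path (simplified: b not adjusted for X,
    # unlike the SEM estimates reported in the article)
    return slope(xs, ms) * slope(ms, ys)

random.seed(42)
n = 301  # matches the study's sample size; the values are synthetic
x = [random.gauss(0, 1) for _ in range(n)]
m = [0.5 * xi + random.gauss(0, 1) for xi in x]   # mediator
y = [0.5 * mi + random.gauss(0, 1) for mi in m]   # outcome

reps = 2000
boot = []
for _ in range(reps):
    idx = [random.randrange(n) for _ in range(n)]  # resample with replacement
    boot.append(indirect([x[i] for i in idx],
                         [m[i] for i in idx],
                         [y[i] for i in idx]))
boot.sort()
lower, upper = boot[int(0.025 * reps)], boot[int(0.975 * reps) - 1]
# Mediation is deemed significant when the 95% CI excludes zero,
# the same decision rule applied to the intervals in Table 7
print(round(indirect(x, m, y), 3), (round(lower, 3), round(upper, 3)))
```

Because the synthetic a and b paths are both positive, the resulting interval excludes zero, mirroring how the significant indirect effects in Table 7 are read.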
Share and Cite

Li, M.; Rohayati, M.I. The Relationship between Learning Outcomes and Graduate Competences: The Chain-Mediating Roles of Project-Based Learning and Assessment Strategies. Sustainability 2024, 16, 6080. https://doi.org/10.3390/su16146080