Article

Computer-Based Scaffolding for Sustainable Project-Based Learning: Impact on High- and Low-Achieving Students

1 School of Education, City University of Macau, Macau, China
2 Department of Education Quality Monitoring and Evaluation, Zhongshan Teacher Development Center, Zhongshan 528403, China
3 College of Education for the Future, Beijing Normal University, Zhuhai 519087, China
4 School of Science and Technology, Hong Kong Metropolitan University, Hong Kong, China
5 Faculty of Education, The University of Hong Kong, Hong Kong, China
6 Department of Educational Information Technology, East China Normal University, Shanghai 200062, China
* Authors to whom correspondence should be addressed.
Sustainability 2022, 14(19), 12907; https://doi.org/10.3390/su141912907
Submission received: 30 July 2022 / Revised: 16 September 2022 / Accepted: 5 October 2022 / Published: 10 October 2022
(This article belongs to the Special Issue Educational Intelligence and Emerging Educational Technology)

Abstract

Project-based learning, in which students engage in meaningful learning with authentic projects and build agency and autonomy for sustainable learning, has been increasingly promoted in higher education. However, completing an authentic project is a complex process, which may pose challenges to many students, especially low-achievers. This study incorporated computer-based scaffolding into a project-based programming course to make complex project learning accessible to students. The scaffolding was designed based on the four-component instructional design (4C/ID) model. The results show that with the support of computer-based scaffolding, all participants maintained a high level of motivation during the course. At the end of the course, their performance had improved by 35.49% in product quality and 38.98% in subject knowledge; their programming thinking skills had improved by 20.91% in problem understanding, 21.86% in modular design, and 25.70% in process design. Despite discrepancies in academic achievement among the participants at the beginning of the course, low-achievers' post-study performance in product quality and programming thinking skills became similar to that of high-achievers, and their post-study performance in subject knowledge became similar to that of medium-achievers. The findings reveal the promising role of computer-based scaffolding in making complex learning with real-world projects achievable for a wide range of students and in reducing the gaps between high- and low-achieving students.

1. Introduction

To help students develop knowledge, skills, and resilience for survival in immensely complex and unpredictable situations, sustainable learning has been increasingly promoted [1]. Sustainable learning is more than the acquisition of knowledge and skills. It involves enabling learners to become more autonomous, develop agency over the learning process, and effectively engage in lifelong learning. Sustainable learning requires students to learn not only subject knowledge and skills, but also the process of learning itself (i.e., learning how to learn) to build agency, autonomy, and self-direction for lifelong learning [2,3].
In educational practice, students have been increasingly encouraged to take a high level of responsibility for their learning through student-centered pedagogies, such as project-based learning and problem-based learning [4,5]. In higher education, project-based learning (PjBL) has been widely promoted, especially in senior years of undergraduate studies, where students are encouraged to learn by working with authentic projects and creating tangible products or artifacts closer to professional reality [6]. The core idea of PjBL is to have students actively engage in meaningful learning with authentic projects or real-world problems to integrate knowing and doing and develop agency over the learning process [7,8].
Recent reviews of research in PjBL have revealed the promising benefits of PjBL in improving the understanding of abstract knowledge and the development of soft skills with respect to problem-solving, communication, decision making, and self-regulation [8,9]. However, there is concern that PjBL involves complex processes that require higher-order thinking skills, making it difficult to sustain effective learning with projects. While students in PjBL are expected to take a high level of responsibility for their learning, many students have difficulties completing a real-world project [4,7,10,11,12,13]. The challenge can be more serious for low-achieving students, who often have inadequate skills to complete a project. To address the challenge, this study incorporated computer-based scaffolding in a project-based course delivered through an online system. We investigated whether and how the approach might benefit students of different levels of academic achievement (e.g., high-, medium-, and low-achievers) to sustain their PjBL.

1.1. Challenges in Implementing PjBL

Despite the advantages of PjBL in improving students' knowledge and skills, recent studies have reported difficulties in implementing it [8,9]. Unlike traditional education, PjBL involves a wide range of problem-solving activities and extensive hands-on practice. Students in PjBL need to produce solutions to nontrivial problems by investigating problems, exploring solutions, drawing conclusions, and creating artifacts in a variety of forms [9]. Many students have difficulties completing these complex activities without the necessary support; meanwhile, teachers reported experiencing challenges in designing PjBL curricula and in supporting students during PjBL, for example, assessing students' progress, diagnosing their problems, and providing feedback [4,7,10,11,13,14]. As a result, PjBL is often not fully implemented in educational practice [9]. Moreover, while students in PjBL are expected to achieve learning outcomes in multiple aspects, such as product quality, thinking skills, and subject knowledge, many studies adopting PjBL failed to demonstrate these outcomes [9].

1.2. High- and Low-Achieving Students in PjBL

Students in PjBL must go through a complex process to complete a real-world project. Completing the complex process requires higher-order thinking skills, such as understanding problems, exploring and refining ideas, designing plans, analyzing data, drawing conclusions, and implementing solutions. However, students often have inadequate knowledge and skills to complete the complex process, which is difficult to predefine since there is no single algorithm for completing complex problem-solving tasks [15,16,17]. This problem makes PjBL challenging to many students, especially low-achieving ones, who often have inadequate higher-order thinking skills required for complex problem-solving tasks.
Previous studies indicate that many students, especially low-achievers, have problems completing a real-world project in PjBL courses; many teachers found it difficult to teach problem-, project-, or inquiry-based curricula involving complex problem solving [7,18,19]. As claimed by many teachers, tasks requiring higher-order thinking are appropriate mainly for high-achieving students, whereas low-achievers tend to have inadequate ability to complete such tasks [20]. Given the inconclusive findings on the outcomes of adopting PjBL [7,9], more research is needed to investigate whether and how students at different levels of academic achievement may benefit from PjBL.

1.3. Scaffolding in Learning with Complex Real-World Projects

In view of the complexity of working with real-world problem-solving tasks or projects, researchers highlight the importance of providing learners with necessary support to facilitate or scaffold learning in such contexts [15,16,21]. Commonly used approaches include using prompts or hints to bring learners’ attention to important issues of complex tasks, structuring or decomposing a complex task process into a set of main actions, and using key questions to help learners to recognize the important goals to pursue in the task [21,22,23,24,25].
Scaffolding aligns with the four-component instructional design (4C/ID) model, a framework for systematic learning with complex tasks [26]. According to the 4C/ID model, the phases that a learner must go through to complete a complex problem-solving task should be made explicit to learners; moreover, learners should be provided with guidance, supportive information, or feedback to successfully complete each phase. Scaffolding also aligns with the cognitive apprenticeship model, which holds that performing a complex task involves an implicit process and that it is critical to make this process visible for novices to observe, enact, and practice with expert help [27].
With the promotion of technology in educational practice, computer-based scaffolding integrated in computer-based learning environments has been increasingly explored to support student learning with complex problem-solving tasks. Research shows that computer-based scaffolding has been mostly used in science education and STEM education contexts and has played an important role in improving students’ higher-order thinking skills and learning outcomes [28,29]. While computer-based scaffolding has the potential to support student learning in PjBL contexts, it remains unknown whether and how computer-based scaffolding may benefit students of different levels of academic achievement (e.g., high-, medium-, and low-achievers), who may differ in their ability to complete the tasks requiring higher-order thinking skills.

1.4. Affective Experiences in PjBL

Compared to traditional education, students are more likely to be motivated in PjBL, as they are encouraged to apply abstract knowledge to contextualized real-world projects [9]. However, there is concern that many students experience difficulties in PjBL which make them feel discouraged and frustrated and may influence their motivation and learning outcomes [7]. Prior research shows that affective experiences are closely intertwined with cognitive experiences [30,31]. If a learning task is too complex, students may have difficulties engaging in effective thinking; furthermore, they may feel frustrated and anxious, with a lack of confidence to persist in learning. Such negative affective experiences can impede cognitive processes, whereas positive affect can foster effective thinking and learning [32].
Affective experiences in educational contexts mainly concern motivational and emotional experiences. Regarding motivation, more attention is paid to intrinsic motivation, which can be measured in terms of interest/enjoyment, perceived competence, effort/importance, value/usefulness, pressure/tension, and perceived choice [33]. Among these, perceived competence appears particularly salient in self-regulated learning contexts such as PjBL, since students' perceived competence may influence the degree to which they persist in learning when facing challenges. Emotion encompasses learners' positive and negative reactions to the learning experience. Enjoyment is a common positive emotion, whereas anxiety and tension are typical negative emotions [32,34]; both are related to intrinsic motivation. In this study, we included these key elements to analyze students' affective experiences in PjBL. Although PjBL can stimulate students' motivation, it is unclear whether that motivation can be sustained in learning with complex projects [14].

1.5. Summary of Existing Studies

The review of the literature shows that PjBL is increasingly promoted in higher education to help students to connect knowledge with practice and improve their knowledge and skills by working with meaningful projects. PjBL also enables students to build agency and autonomy for sustainable learning in response to complex dynamic environments. Despite the advantages of PjBL in stimulating motivation and improving learning, there is concern that PjBL involves a complex process, making it a challenge for many students, especially low-achievers, to sustain effective learning with projects. While computer-based scaffolding has the potential to support student learning with real-world problem-solving projects or tasks, it remains unknown whether and how computer-based scaffolding may benefit a wide range of students of different levels of academic achievement. To address the gap, this study investigated whether and how computer-based scaffolding can support PjBL among students of different levels of academic achievement.

1.6. The Present Study

PjBL has been promoted in educational practice across multiple levels (e.g., primary, secondary, and tertiary) and in various subject areas, including engineering, science, mathematics, and social science [8,19,35,36,37]. This study focused on PjBL of computer programming, an important subject in engineering. PjBL has been increasingly adopted in engineering education and is regarded as among the most suitable means of developing students' professional competencies in response to society's demands on engineering professionals [38,39,40]. In programming courses, PjBL can help students master abstract programming knowledge by applying it to real-world programming projects, mainly through developing computer programs [19].
Considering that many students have problems going through the process of completing a real-world project, this study incorporated computer-based scaffolding in a project-based programming course delivered through an online system. We investigated whether and how this approach might benefit students of different levels of academic achievement to sustain their PjBL. Based on the literature, students' levels of academic achievement are often determined based on knowledge test scores [20]. Students in this study were categorized into high-, medium-, and low-achieving groups according to their pre-study knowledge test scores. The learning outcomes were examined in multiple aspects, including product quality, thinking skills, subject knowledge, and affective experiences.
The research questions (RQ) to be addressed in the study include:
  • RQ1: What are the academic achievement and affective experiences acquired by students from the PjBL course with computer-based scaffolding?
  • RQ2: Do high-, medium-, and low-achieving students differ in their academic achievement and affective experiences acquired from the PjBL course with computer-based scaffolding? If so, what are the differences?
The findings of the study may contribute to the literature on the role of computer-based scaffolding in learning with real-world problem-solving projects or tasks. In particular, they reveal whether computer-based scaffolding can make complex PjBL accessible to a wide range of students and whether it can reduce the gaps between high- and low-achieving students.

2. Methods

2.1. Participants

The study was conducted with senior-year students from an ordinary university in southern China and received ethical approval from the Human Research Ethics Committee of the researchers' university. Students gave informed consent to participate in this study. The participants were 69 Year 3 computer science students (43 males and 26 females) with an average age of 21.1 years. They were offered a 6-week project-based course delivered through an online system, which aimed to help senior-year students develop authentic programming skills. ASP.NET, a popular framework for developing dynamic modern web applications and services, was selected as the learning subject of this course. Before the study, the participants had acquired basic programming knowledge and online learning experience from previous courses. All participants completed the learning activities required for the project-based course.

2.2. Measures and Instruments

Students’ performance in this study was assessed based on the programming assessment model proposed by [41]. The assessment considers two dimensions: program code and programming process. The former represents the solution or product, and the latter reflects higher-order thinking involved in the design and development of the program. Accordingly, students’ programming performance in this study was examined in terms of product quality and programming thinking skills, as elaborated below.
Product quality. The quality of student-generated products (program codes) was assessed through a programming task before and after the study. The two tasks, one for the pre-test and the other for the post-test, were designed to be practical and moderately difficult; they differed in content but were at the same level of difficulty, as determined by two domain experts (an experienced programming teacher and an industry expert in computer programming). Based on the aforementioned model proposed by [41], the program codes were assessed in terms of correctness, efficiency, reliability, and readability. Scores ranged from 0 to 10 on each of the four subscales, and the highest possible score for product quality was 40 points. The rubrics for assessing program codes are provided in Appendix A.
Programming thinking skills. In each of the two programming tasks, students were required to submit a problem statement, modular design diagram, and program flowchart to present their thinking when they worked on the task. These submissions were used to assess students’ programming thinking skills in terms of (a) problem understanding, (b) solution planning (i.e., modular design), and (c) solution design (i.e., process design). The highest possible score was 20 for each of the three subscales. The assessment rubrics are presented in Appendix B.
Knowledge tests. In addition to programming performance, students' programming knowledge was assessed before and after the study. The pre- and post-tests used the same types of questions, including single-choice, fill-in-the-blank, and short program-writing questions. The two tests had the same level of difficulty, as determined by the two domain experts mentioned above. The highest possible score on each test was 100 points.
Affective experiences. Students’ affective experiences in PjBL were assessed at the end of the study through a questionnaire survey. The survey was designed based on the intrinsic motivation inventory model [33], which assesses students’ motivational and emotional experiences in terms of interest/enjoyment, perceived competence, effort/importance, value/usefulness, and pressure/tension. The questionnaire used a 5-point Likert scale ranging from 1 (strongly disagree) to 5 (strongly agree). Examples of the items include “I enjoyed attending this project-based programming course”; “I felt confident during the learning of this course”; “I put a lot of effort into this course”; “I think this course is useful”; and “I got nervous while studying in this course”. Cronbach’s alpha values (0.90 for interest/enjoyment, 0.88 for effort/importance, 0.80 for value/usefulness, 0.89 for perceived competence, and 0.93 for pressure/tension) confirm the internal consistency and reliability of the sub-scales.
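The reported Cronbach's alpha values can be reproduced from item-level responses using the standard formula. The following Python sketch is illustrative only; the score matrix here is made up, not the study's survey data:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of items in the sub-scale
    item_vars = items.var(axis=0, ddof=1)      # per-item sample variance
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)
```

For perfectly consistent items (e.g., each respondent's second item is always one point above the first), the function returns 1.0; values above 0.8, as reported for all five sub-scales, indicate good internal consistency.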
Student comments. To better understand students’ learning experiences beyond those reflected in the survey, students’ comments on the PjBL course were collected. The participants were asked to give written responses to two open-ended questions: (1) What are your views on the advantages of the course? (2) What are your views on the weaknesses of the course?

2.3. Learning Task

During the PjBL course, students were asked to complete an authentic programming project—membership management. In this project, students were requested to develop a computer program that can be used for member registration, password setting and resetting, user login, login validation, and updating member information. During the course, students were guided through a set of phases, including problem understanding, solution planning (modular design), solution design (process design), solution implementation (coding), and evaluation and reflection, to complete the project. In each phase, students generated relevant learning artifacts, such as problem statements, modular design diagrams, program flowcharts, and program code.

2.4. Online Learning Environment with Computer-Based Scaffolding

Students participated in the PjBL course delivered with the support of an online learning system. The system incorporated computer-based cognitive scaffolding to make the complex process involved in PjBL visible to students. The computer-based cognitive scaffolding was designed based on the four-component instructional design (4C/ID) model, a conceptual model for systematic learning with complex tasks [26]. Based on this model, the system specifies the key phases of the process a learner must go through to complete a programming project and the rules of thumb or heuristics that might help the learner successfully complete each phase.
The online learning system was implemented in a browser/server architecture. Front-end coding languages (HTML, CSS, and JavaScript) were used to construct the client-side applications. C#, a class-based, object-oriented programming language was used to establish the data processing logic and server-side web services. Microsoft SQL Server, a database management system, was used to store, retrieve, and manipulate the data in the system.
Figure 1 presents some screenshots of the online system. The general process of completing a programming project is outlined on the main page of the system, making explicit the complex process that a learner must go through to complete a programming project. The process includes several phases: problem understanding, solution planning (modular design), solution design (process design), solution implementation (coding), and evaluation and reflection. Presenting the process in a visual form gave students a holistic view of the complex process of PjBL. As shown in the upper-left part of Figure 1, when the mouse hovers over the icon of a specific phase, students can view the basic rules for completing that phase. Students could click on an icon to open a page and perform the relevant activities for the phase.
Problem understanding. In Phase 1, students were guided to formulate a problem statement for a clear understanding of the problem. Relevant heuristics were also presented in the system. For example, a structured form was provided for students to formulate the problem statement by specifying the project requirements and project goals.
Solution planning (modular design). In Phase 2, students were requested to generate a solution plan based on their understanding of the project requirements and goals. They were guided to produce a solution plan by proposing a set of functional modules and specifying the relationships between the modules. The learning system provided relevant guidance and a diagramming tool for students to outline the modular design in a modular block diagram.
Solution design (process design). In Phase 3, students were guided to generate a detailed design of the solution based on the modular design. They were given relevant guidance and a diagramming tool to design the solution by building a program flowchart demonstrating the solution process within and across the functional modules.
Solution implementation (coding). In Phase 4, students were requested to translate the modular design and process design into an executable program by writing ASP.NET source code. Students could submit their programs using an online coding tool in the system and modify them throughout the project.
Evaluation and reflection. In Phase 5, students were asked to evaluate their programs by testing and debugging their codes. In addition to the program code, they could also review and modify their artifacts (e.g., modular block diagram, flowchart) generated in previous phases. In the meantime, students could receive the teacher’s feedback to improve their artifacts.

2.5. Procedure

The participants were offered a 6-week PjBL course, as shown in Figure 2. The course was delivered online, which was crucial to supporting student learning during the COVID-19 pandemic [42,43]. The duration and learning activities of the course were based on a similar PjBL course presented in prior studies [11,12].
In Week 1, the participants completed a short survey (about 5 min). Next, they were given 1 h of face-to-face instruction on how to use the online learning system and the computer-based scaffolding tool to complete a programming project. Relevant information and guidance were also available in the system for flexible access. During the instruction, the teacher demonstrated with a sample project, which students could use to practice and become familiar with the learning environment. Afterwards, the students completed a knowledge test in one hour and a programming task in another hour to assess their programming knowledge and performance before the study.
From Week 2 to Week 5, students performed independent learning with a programming project—membership management. They were asked to learn at their own pace and spend about 2 h per day or 10 h per week on the project based on the workload of the project. In the meantime, the teacher monitored students’ progress and provided comments and feedback on their artifacts via the online system. Considering the complexity of PjBL, two face-to-face consultations were arranged in Week 3 and Week 5, respectively. Each consultation session lasted for one hour.
In Week 6, students completed the post-study knowledge test in one hour and completed a programming task in another hour. In addition, a questionnaire survey was administered in 20 min to collect students’ affective experiences and their comments on the PjBL course.

2.6. Data Analysis

The SPSS 27 statistical program was used to analyze the quantitative data and generate the graphs. The collected data were analyzed as follows. First, two programming experts (the experienced teacher and the industry expert) graded the students’ knowledge tests, programming thinking skills, and program codes blindly and independently before and after the study, and their scores were averaged. The inter-rater reliability measured using Cohen’s Kappa ranged from 0.820 to 0.900 (see Table 1), suggesting a high level of agreement.
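Cohen's Kappa, used here for inter-rater reliability, measures chance-corrected agreement between the two raters. A minimal Python sketch with hypothetical ratings (not the raters' actual scores) shows the computation:

```python
import numpy as np

def cohens_kappa(r1, r2):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    po = np.mean(r1 == r2)  # observed proportion of agreement
    # expected agreement if the raters assigned categories independently
    pe = sum(np.mean(r1 == c) * np.mean(r2 == c) for c in np.union1d(r1, r2))
    return (po - pe) / (1 - pe)
```

Identical ratings yield a kappa of 1; values between 0.820 and 0.900, as in Table 1, are conventionally read as strong agreement.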
Second, paired-sample t-tests were conducted to compare students’ knowledge test scores, product quality, and thinking skills before and after the study.
Third, after students were categorized into high-, medium-, and low-achieving groups according to pre-study knowledge test scores, a set of one-way ANOVAs was conducted to evaluate the differences among the three groups in terms of subject knowledge, product quality, thinking skills, gain scores (i.e., the difference between pre- and post-test scores), and affective experiences. When a significant difference was detected, Scheffe’s post hoc analysis was conducted to compare the difference between each pair of the three groups.
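The second and third analysis steps can be sketched in Python with SciPy. The scores below are simulated for illustration and are not the study's data; Scheffe's post hoc test is omitted, as SciPy does not provide it directly:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated pre/post scores for 69 students (NOT the study's data)
pre = rng.normal(60, 10, 69)
post = pre + rng.normal(15, 8, 69)

# Step 2: paired-sample t-test comparing pre- and post-study scores
t, p = stats.ttest_rel(pre, post)

# Cohen's d for paired samples: mean difference / SD of the differences
diff = pre - post
d = diff.mean() / diff.std(ddof=1)

# Step 3: one-way ANOVA across three achievement groups formed from pre scores
order = np.argsort(pre)
low, medium, high = post[order[:19]], post[order[19:50]], post[order[50:]]
f, p_anova = stats.f_oneway(low, medium, high)
```

With the simulated improvement built in, the paired t-statistic is large and negative (pre minus post), mirroring the negative t and d values reported in Tables 3 and 4.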
Fourth, a thematic content analysis was performed to identify common themes in students’ comments, i.e., their responses to the open-ended questions. The analysis followed an iterative, bottom-up process of code and theme generation. The first author and a trained researcher coded 30% of the responses to each of the two questions, categorizing them into a set of themes. Discrepancies between the two coders were discussed and reconciled by further consulting the data. After consensus was reached, the two coders independently coded another 20% of the dataset, and the inter-coder agreement was 0.98. Once the remaining differences in their coding were discussed and resolved, the first author coded all the response data based on the confirmed coding framework.

3. Results

3.1. Learning Outcomes and Affective Experiences of All Participants

Table 2 presents the descriptive statistics of students’ pre- and post-study knowledge test scores, product quality, thinking skills, and affective experiences.
The results of students’ test scores and product quality are presented in Table 3. The paired sample t-tests reveal that students’ subject knowledge (t (68) = −10.39, p < 0.001, d = −1.25) and product quality (t (68) = −6.20, p < 0.001, d = −0.75) significantly improved after completing the PjBL course, with their average scores increasing by 38.98% and 35.49%, respectively.
The results of students’ programming thinking skills are presented in Table 4. At the end of the course, students’ programming thinking skills were significantly improved in terms of problem understanding (t (68) = −5.91, p < 0.001, d = −0.71), modular design (t (68) = −6.44, p < 0.001, d = −0.78), and process design (t (68) = −7.75, p < 0.001, d = −0.93), with their average scores increasing by 20.91%, 21.86%, and 25.70%, respectively.
Table 5 presents the descriptive statistics of students’ affective experiences during the PjBL course, showing that students had positive experiences in terms of interest/enjoyment (Mean = 3.97, SD = 0.66), competence (Mean = 3.86, SD = 0.69), effort/importance (Mean = 3.95, SD = 0.72), and value/usefulness (Mean = 4.22, SD = 0.46), while they perceived a slightly low level of pressure/tension (Mean = 2.84, SD = 1.12).

3.2. Differences in Learning Outcomes between High-, Medium-, and Low-Achievers

Students were categorized into high-, medium-, or low-achieving groups according to their pre-study knowledge test scores. The cut-off points were determined according to the 27 percent rule from the extreme group approach [44,45]. The students with scores in the top 27% were assigned to the high-achieving group (n = 19), those in the bottom 27% to the low-achieving group (n = 19), and the rest to the medium-achieving group (n = 31).
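The 27% extreme-group split described above can be sketched as follows; this minimal Python illustration reproduces the paper's group sizes of 19/31/19 for n = 69:

```python
import numpy as np

def split_groups(scores, frac=0.27):
    """Assign 'low'/'medium'/'high' labels using the 27% extreme-group rule."""
    scores = np.asarray(scores, dtype=float)
    n = len(scores)
    k = round(n * frac)            # 69 * 0.27 ≈ 19 students per extreme group
    order = np.argsort(scores)     # indices sorted by score, lowest first
    groups = np.full(n, "medium", dtype=object)
    groups[order[:k]] = "low"      # bottom 27%
    groups[order[-k:]] = "high"    # top 27%
    return groups
```

Applied to 69 pre-test scores, this yields 19 low-achievers, 31 medium-achievers, and 19 high-achievers, matching the group sizes reported in the text.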
Subject knowledge. Figure 3 shows the knowledge test performance of the three groups (high, medium, and low achievers) based on the average scores of each group before and after the study. The ANOVA results presented in Table 6 indicate that there existed differences between the three groups in their pre-test scores (F (2, 66) = 171.88, p < 0.001) and post-test scores (F (2, 66) = 41.46, p < 0.001). Furthermore, the Scheffe’s post hoc comparison reveals that the knowledge gap between the low- and medium-achieving groups was not significant in the post-test. Consistently, the ANOVA result of students’ gain scores in subject knowledge (F (2, 66) = 9.61, p < 0.001) shows that the low-achieving group made more progress in subject knowledge than the other two groups.
Product quality. Figure 4 presents the performance in product quality of the three groups (high-, medium-, and low-achievers) based on the average scores of each group before and after the study. The ANOVA results (see Table 7) indicate that there were significant differences between high-, medium-, and low-achieving students in their product quality at the beginning (F (2, 66) = 10.82, p < 0.001) and the end of the study (F (2, 66) = 10.82, p < 0.001). Scheffé’s post hoc analysis shows that the high-achieving group’s product quality was better than that of the other two groups in the pre-test; however, the discrepancies between the high- and low-achieving groups were not significant in the post-test. The ANOVA result for students’ gain scores (F (2, 66) = 4.40, p = 0.016) indicates that low achievers made more progress than medium achievers in their product quality.
Thinking skills. Figure 5 presents the three groups’ performances in programming thinking skills in terms of problem understanding, modular design, and process design based on the average scores of each group. The ANOVA results shown in Table 8 indicate that high-, medium-, and low-achieving students differed significantly in all three dimensions of thinking skills at the beginning of the study (F (2, 66) = 10.02, p < 0.001; F (2, 66) = 8.83, p < 0.001; F (2, 66) = 14.41, p < 0.001). Although the discrepancies among the three groups still existed at the end of the study (F (2, 66) = 6.46, p = 0.003; F (2, 66) = 7.44, p = 0.001; F (2, 66) = 8.66, p < 0.001), Scheffé’s post hoc analysis shows that the differences between high- and low-achievers in the three dimensions of programming thinking skills were not significant at the end of the study. Consistently, the ANOVA results for gain scores (F (2, 66) = 7.56, p = 0.001; F (2, 66) = 7.26, p = 0.001; F (2, 66) = 10.33, p < 0.001) reveal that, compared to the other two groups, the low-achieving group made more improvement in all three dimensions of programming thinking skills.
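The one-way ANOVA F statistics used throughout this section can be computed from the group scores directly. The following is a minimal pure-Python sketch of the F statistic and its degrees of freedom (the three example groups are hypothetical; computing the p-value additionally requires the F distribution, e.g. via scipy):

```python
from statistics import mean

def one_way_anova(*groups):
    """One-way ANOVA: returns (F, df_between, df_within)."""
    grand = mean([x for g in groups for x in g])  # grand mean of all scores
    n = sum(len(g) for g in groups)
    k = len(groups)
    # Between-groups sum of squares: group sizes times squared mean offsets
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    # Within-groups sum of squares: squared deviations from each group mean
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    df_b, df_w = k - 1, n - k
    f = (ss_between / df_b) / (ss_within / df_w)
    return f, df_b, df_w

# Three illustrative groups (not the study data)
f, df_b, df_w = one_way_anova([1, 2, 3], [2, 3, 4], [3, 4, 5])
```

With the study's three achievement groups (n = 19, 31, 19), the degrees of freedom come out as (2, 66), matching the reported F (2, 66) values.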

3.3. Differences in Affective Experiences between High-, Medium-, and Low-Achievers

As shown in Table 9, the ANOVA results indicate that students of the three groups had similar affective experiences (i.e., interest/enjoyment (F (2, 66) = 0.22, p > 0.05), perceived competence (F (2, 66) = 1.78, p > 0.05), effort/importance (F (2, 66) = 3.01, p > 0.05), value/usefulness (F (2, 66) = 1.39, p > 0.05), and pressure/tension (F (2, 66) = 1.76, p > 0.05)).

3.4. Student Comments on the PjBL Course

Students’ comments on the PjBL course were analyzed based on their responses to the two open-ended survey questions. Table 10 presents the advantages of the PjBL course as commented on by the students, listing the themes in students’ responses, illustrative examples, and the frequency of each theme. Generally, most of the participating students found the PjBL course effective in facilitating self-regulated learning (64%), developing problem-solving ability (54%), and providing scaffolding for learning (51%). Some students mentioned that the PjBL course was helpful in supporting knowledge acquisition (42%), promoting knowledge–practice integration (33%), and stimulating the motivation to learn programming (32%). Among them, more low-achieving students made positive comments related to self-regulated learning, knowledge acquisition, knowledge–practice integration, and learning resources, while more high- and medium-achieving students commented on the advantage related to reflective learning.
Regarding students’ views on the weaknesses of the PjBL course, Table 11 presents the results based on the analysis of student responses. Most students mentioned technical problems with the learning system (42%) and insufficient interaction with the teacher during the study (25%). Some students mentioned learning difficulties (22%), insufficient learning time (17%), and a lack of supervision from the teacher (13%) during the study. Among them, more low-achieving students than high-achievers commented on the weakness of inadequate teacher–student interaction during the study.

4. Discussion

4.1. Learning Outcomes, Affective Experiences, and Comments of All Participants

The assessment results show that students’ academic performance in subject knowledge, product quality, and thinking skills was significantly enhanced at the end of the PjBL course. The findings add evidence for the efficacy of computer-based scaffolding in supporting student thinking and learning with complex real-world projects and problem-solving tasks [22,46,47]. In this study, by integrating computer-based scaffolding into the online learning environment, students could follow the scaffold to self-regulate their learning, e.g., organizing thoughts, developing artifacts, and reflecting on their performance in each stage of the project. Meanwhile, the scaffold allowed the teacher to observe students’ progress and performance in each stage and provide adaptive feedback during the course. The findings also echo previous research suggesting that making the complex cognitive process accessible to students is crucial to effective learning in programming courses [11,48]. In this study, the PjBL process displayed on the home page of the learning system gave students a clear view of the complex process they needed to go through to complete a programming project. Given such support, they could engage in effective thinking, sustain self-regulated learning with a complex project, and achieve desirable progress by the end of the study.
With respect to students’ affective experiences, the survey results show that the participants perceived high levels of interest, competence, effort, and value, as well as a low level of tension, during the learning process. The findings are in line with prior research suggesting that scaffolding a complex task by making the complex process accessible to students can enhance students’ motivation and enjoyment by increasing self-efficacy and reducing anxiety [47,49,50,51]. Although prior research indicates that students can be highly motivated in PjBL courses, which encourage them to apply abstract knowledge to contextualized real-world projects [9], many students experience difficulties in such courses. These difficulties can leave students discouraged and frustrated and can undermine their motivation and learning outcomes [7]. The findings of this study suggest that students engaged in PjBL with the support of computer-based scaffolding can sustain their motivation while learning with a complex project.
The above findings on students’ academic achievement and affective experiences are also supported by students’ comments on the PjBL course. Most of the participants felt the PjBL course to be effective in facilitating self-regulated learning, developing problem-solving ability, and providing scaffolding for learning. Some students mentioned that the course was helpful in supporting knowledge acquisition, promoting knowledge–practice integration, and stimulating their motivation to learn programming.

4.2. Differences in Learning Outcomes between High-, Medium-, and Low-Achieving Students

The results uncovered how the three groups of students (high-, medium-, and low-achievers) benefited differently from the PjBL course with computer-based scaffolding. In terms of subject knowledge, the three groups’ performances differed at the beginning of the study, but the gap between low and medium achievers almost disappeared at the end of the study. The findings suggest that the PjBL course with the support of computer-based scaffolding enables low achievers to consolidate their subject knowledge and make considerable progress in the knowledge test. They made more progress in subject knowledge than the other two groups.
With respect to product quality, high-achieving students performed better than low- and medium-achieving students at the beginning, but the discrepancy between the high- and low-achieving groups almost disappeared at the end of the study. The gain scores in product quality indicated that low-achieving students made more progress than medium-achieving students.
The findings regarding programming thinking skills are similar to those on product quality. Despite significant differences in thinking skills among high-, medium-, and low-achieving students at the beginning of the study, there were no significant differences between low- and high-achieving students in the three dimensions of programming thinking skills (i.e., problem understanding, modular design, and process design) at the end of the study. The low-achieving group made more improvement in problem understanding and process design than the other two groups.
Contrary to the belief that academically weaker students are often not prepared for programming projects [18], this study demonstrates that, with the support of computer-based scaffolding, low-achieving students can perform as well as high-achieving students in programming thinking skills and product quality. Weaker students often lack high-level cognitive and metacognitive skills [18,52], and scaffolding for complex thinking and learning can effectively engage them in complex tasks and help them benefit more from the learning program [53,54]. However, it should be noted that the medium-achieving students gained less in this study; further research is needed to investigate the reasons.

4.3. Differences in Affective Experiences and Comments between High-, Medium-, and Low-Achieving Students

Regarding affective experiences, there were no significant differences between the high-, medium-, and low-achieving students in terms of interest/enjoyment, perceived competence, effort/importance, value/usefulness, and pressure/tension at the end of the study. The results suggest that the PjBL course with computer-based scaffolding could sustain students’ motivation for learning programming.
Students’ comments on the advantages and weaknesses of the PjBL course differed slightly between low- and high-achievers. In particular, more low-achieving students gave positive comments related to self-regulated learning, knowledge acquisition, and knowledge–practice integration, consistent with their larger gains in academic achievement compared to medium-achievers. The results echo the findings of previous research that scaffolding learning with complex tasks can improve students’ task performance [22,46,55] and reduce their anxiety and frustration while working with complex tasks [47,49,50,51]. It is possible that the proposed scaffolding was more helpful for low-achieving students in developing complex programming knowledge and skills.
Meanwhile, it is also noteworthy that some low-achieving students mentioned inadequate interaction with the teacher during the PjBL course. The results suggest that PjBL without prompt guidance from the teacher is still a challenge for some students. Previous research also indicated that students in PjBL require self-regulated learning skills [56], which might be another challenge for low-achieving students. Future studies on PjBL may need to take this issue into account by offering adequate support to students—for example, meeting with teachers or receiving voice or video feedback from teachers during the learning process.

5. Conclusions

PjBL has been increasingly promoted in higher education to enable learners to improve knowledge and skills by working with meaningful projects and to build agency and autonomy for sustainable learning in response to complex, dynamic environments. Despite the advantages of PjBL in stimulating motivation and improving learning, there is concern that PjBL involves a complex process that is difficult for many students, especially low achievers, to complete. This study incorporated computer-based scaffolding into a project-based programming course to make complex PjBL accessible to students. The computer-based scaffolding made explicit the complex process of completing a realistic programming project and provided heuristics for completing each phase of the project. It helped students to capture the complex process of learning with real-world projects (i.e., learning how to learn), which is crucial to building agency, autonomy, and self-direction for sustainable learning.
The results show that with the support of computer-based scaffolding, all participating students maintained a high level of motivation in the PjBL course; their performances in subject knowledge, product quality, and programming thinking skills were significantly improved at the end of the course. Despite academic achievement discrepancies among the students before the course, low achievers’ post-study performance in product quality and programming thinking skills became similar to that of high achievers, and their performance in subject knowledge became similar to that of medium achievers at the end of the course. The participating students commented on the benefits of computer-based scaffolding in facilitating self-regulated learning, developing problem-solving ability, and providing scaffolding for complex learning.
Our findings have several implications for research and practice in sustaining student learning with real-world complex problem-solving projects or tasks. First, while students are encouraged to take a high level of responsibility in learning with real-world problem-solving projects or tasks, they often face substantial challenges in accomplishing the complex process. It is thus important to make the complex process accessible to novice learners, enabling them to build agency and autonomy for sustainable learning. Second, such support is more crucial to low-achieving students, who often have inadequate higher-order thinking skills to accomplish the complex process. Providing such support has the potential to reduce the disadvantage of low-achievers and make complex PjBL accomplishable by a wide range of students. Third, although PjBL can stimulate students’ motivation to learn, it is important to sustain students’ motivation, especially when they face challenges in working with complex projects or tasks. Effective design of computer-based support can play a role in facilitating complex learning, sustaining student motivation, and improving learning outcomes. Fourth, in learning with real-world problem-solving projects, students are expected to improve higher-order thinking skills and problem-solving abilities. Future studies should pay more attention to assessing students’ thinking skills and problem-solving abilities to demonstrate a full picture of project-based learning outcomes.
This study has several limitations. First, the study adopted a one-group design without a control group, which limits the internal validity of the findings. A quasi-experimental design could be used to further investigate the effects of computer-based scaffolding in PjBL. Second, the participants of this study were from one university, which may constrain the generalization of the findings to some extent. Third, students mentioned some weaknesses of the online learning system and the PjBL course, which may have affected their learning outcomes. Future studies will refine the learning system and improve the course design and implementation, for example, by enhancing teacher–student interactions during the study.

Author Contributions

Conceptualization, J.P.; software, J.P. and B.Y.; data curation, J.P. and B.Y.; formal analysis, J.P.; validation, M.S. and M.W.; writing—original draft preparation, J.P., M.S. and M.J.; writing—review and editing, M.S. and M.W.; supervision, M.W. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by the 2021 Higher Education Fund of the Macao SAR Government (Number: HSS-CITYU-2021-07), National Natural Science Foundation of China (No. 61977023) and the Eastern Scholar Chair Professorship Fund (No. JZ2017005) from the Shanghai Municipal Education Commission of China.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and approved by the Human Research Ethics Committee for Non-Clinical Faculties of the University of Hong Kong (No. 84080115, 11 Jan 2015).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The datasets generated in this study are not publicly available due to the ethical requirements but are available from the corresponding author upon reasonable request.

Acknowledgments

The authors thank Haijing Jiang for his valuable support for this study.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

4C/ID: Four-component instructional design
ANOVA: Analysis of variance
CSS: Cascading Style Sheets
HTML: HyperText Markup Language
PjBL: Project-based learning
SD: Standard deviation
SQL: Structured Query Language
STEM: Science, technology, engineering, and mathematics

Appendix A

Table A1. Rubrics for assessing product quality.

Correctness (score range 0 to 10):
10—Correct solution specifications/program code and results consistent with problem requirements.
5—Partial solution specifications/program code and/or some results.
0—No solution specifications/program code or results inconsistent with problem requirements.

Efficiency (score range 0 to 10):
10—Most algorithms, data structures, control structures, and language constructs are appropriate.
5—Program accomplishes its task but lacks coherence in choice of either data and/or control structures.
0—Program solution lacks coherence in choice of both data and control structures.

Reliability (score range 0 to 10):
10—Program functions properly under all test cases. Works for and responds to all valid inputs.
5—Program functions under limited test cases. Only works for valid inputs but fails to respond to invalid inputs.
0—Program fails under most test cases.

Readability (score range 0 to 10):
10—Program code includes clear documentation (comments, meaningful identifiers, indentation to clarify logical structure) and user instructions.
5—Program code lacks clear documentation and/or user instructions.
0—Program code is totally incoherent.

Note: Scores of 1–4 and 6–9 were assigned when the quality of the solution was assessed to be between the major units 0, 5, and 10.

Appendix B

Table A2. Rubrics for assessing programming thinking skills.

Problem understanding / Project requirements (score range 0 to 10):
10—Project requirements are clearly and correctly stated. All elements are identified.
5—Project requirements are partially stated. Some elements are not identified. Some statements are incorrect or irrelevant.
0—No relevant project requirements are identified.

Problem understanding / Project goals (score range 0 to 10):
10—Project goals are clearly and correctly stated. All elements are identified.
5—Project goals are partially stated. Some elements are not identified. Some statements are incorrect or irrelevant.
0—No relevant project goals are identified.

Modular design / Functional modules (score range 0 to 10):
10—Detailed and clear planning of the solution, with complete and appropriate functional modules.
5—Partially correct planning of the solution, with incomplete or inappropriate functional modules.
0—No appropriate or relevant modules proposed for the solution plan.

Modular design / Relationships between functional modules (score range 0 to 10):
10—Detailed and appropriate relationships between functional modules.
5—Partially correct or incomplete relationships between functional modules.
0—No or completely inappropriate relationships between functional modules.

Process design / Module decomposition (score range 0 to 10):
10—Complete and appropriate decomposition of functional modules in the process design.
5—Partially correct or incomplete decomposition of functional modules in the process design.
0—No or completely inappropriate decomposition of functional modules in the process design.

Process design / Process organization (score range 0 to 10):
10—Complete and appropriate organization of the process.
5—Partially correct or incomplete organization of the process.
0—No or completely inappropriate organization of the process.

Note: Scores of 1–4 and 6–9 were assigned when the quality of the solution was assessed to be between the major units 0, 5, and 10.

References

  1. Hays, J.; Reinders, H. Sustainable learning and education: A curriculum for the future. Int. Rev. Educ. 2020, 66, 29–52.
  2. Caena, F.; Stringher, C. Towards a New Conceptualization of Learning to Learn. Aula Abierta 2020, 49, 199–216.
  3. Black, P.; McCormick, R.; James, M.; Pedder, D. Learning how to learn and assessment for learning: A theoretical inquiry. Res. Pap. Educ. 2006, 21, 119–132.
  4. English, M.C.; Kitsantas, A. Supporting student self-regulated learning in problem- and project-based learning. Interdiscip. J. Probl.-Based Learn. 2013, 7, 6.
  5. Wright, G.B. Student-Centered Learning in Higher Education. Int. J. Teach. Learn. High. Educ. 2011, 23, 92–97.
  6. Jollands, M.; Jolly, L.; Molyneaux, T. Project-based learning as a contributing factor to graduates’ work readiness. Eur. J. Eng. Educ. 2012, 37, 143–154.
  7. Blumenfeld, P.C.; Soloway, E.; Marx, R.W.; Krajcik, J.S.; Guzdial, M.; Palincsar, A. Motivating project-based learning: Sustaining the doing, supporting the learning. Educ. Psychol. 1991, 26, 369–398.
  8. Chen, C.-H.; Yang, Y.-C. Revisiting the effects of project-based learning on students’ academic achievement: A meta-analysis investigating moderators. Educ. Res. Rev. 2019, 26, 71–81.
  9. Guo, P.; Saab, N.; Post, L.S.; Admiraal, W. A review of project-based learning in higher education: Student outcomes and measures. Int. J. Educ. Res. 2020, 102, 101586.
  10. Helle, L.; Tynjälä, P.; Olkinuora, E. Project-based learning in post-secondary education–theory, practice and rubber sling shots. High. Educ. 2006, 51, 287–314.
  11. Peng, J.; Wang, M.; Sampson, D.; van Merrienboer, J. Using a visualization-based and progressive learning environment as a cognitive tool for learning computer programming. Australas. J. Educ. Technol. 2019, 35, 52–68.
  12. Peng, J.; Wang, M.; Sampson, D. Visualizing the Complex Process for Deep Learning with an Authentic Programming Project. Educ. Technol. Soc. 2017, 20, 275–287.
  13. Thomas, J.W. A Review of Research on Project-Based Learning; Autodesk Foundation: San Rafael, CA, USA, 2000.
  14. Hakamian, H.; Sobhiyah, M.H.; Aghdasi, M.; Shamizanjani, M. How can the Portuguese navigation system in the 15th century inspire the development of the model for project-based learning organizations? Knowl. Manag. E-Learn. 2019, 11, 59–80.
  15. Hmelo-Silver, C.E.; Duncan, R.G.; Chinn, C.A. Scaffolding and achievement in problem-based and inquiry learning: A response to Kirschner, Sweller, and Clark (2006). Educ. Psychol. 2007, 42, 99–107.
  16. Kirschner, P.A.; Sweller, J.; Clark, R.E. Why minimal guidance during instruction does not work: An analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching. Educ. Psychol. 2006, 41, 75–86.
  17. Sasson, I.; Yehuda, I.; Malkinson, N. Fostering the skills of critical thinking and question-posing in a project-based learning environment. Think. Ski. Creat. 2018, 29, 203–212.
  18. Jazayeri, M. Combining mastery learning with project-based learning in a first programming course: An experience report. In Proceedings of the 2015 IEEE/ACM 37th IEEE International Conference on Software Engineering, Florence, Italy, 16–24 May 2015; Volume 2, pp. 315–318.
  19. Pucher, R.; Lehner, M. Project based learning in computer science–A review of more than 500 projects. Procedia-Soc. Behav. Sci. 2011, 29, 1561–1566.
  20. Zohar, A.; Degani, A.; Vaaknin, E. Teachers’ beliefs about low-achieving students and higher order thinking. Teach. Teach. Educ. 2001, 17, 469–485.
  21. Belland, B.R.; Walker, A.E.; Kim, N.J.; Lefler, M. Synthesizing results from empirical research on computer-based scaffolding in STEM education: A meta-analysis. Rev. Educ. Res. 2016, 87, 309–344.
  22. Gijlers, H.; de Jong, T. Using concept maps to facilitate collaborative simulation-based inquiry learning. J. Learn. Sci. 2013, 22, 340–374.
  23. Lazonder, A.W.; Harmsen, R. Meta-analysis of inquiry-based learning: Effects of guidance. Rev. Educ. Res. 2016, 86, 681–718.
  24. Reiser, B.J. Scaffolding complex learning: The mechanisms of structuring and problematizing student work. J. Learn. Sci. 2004, 13, 273–304.
  25. Hooshyar, D.; Yousefi, M.; Wang, M.; Lim, H. A data-driven procedural-content-generation approach for educational games. J. Comput. Assist. Learn. 2018, 34, 731–739.
  26. Van Merriënboer, J.J.G.; Kirschner, P.A. Ten Steps to Complex Learning: A Systematic Approach to Four-Component Instructional Design, 3rd ed.; Routledge: New York, NY, USA, 2018.
  27. Collins, A.; Brown, J.S.; Holum, A. Cognitive apprenticeship: Making thinking visible. Am. Educ. 1991, 15, 6–11.
  28. Kim, N.J.; Belland, B.R.; Walker, A.E. Effectiveness of computer-based scaffolding in the context of problem-based learning for STEM education: Bayesian meta-analysis. Educ. Psychol. Rev. 2018, 30, 397–429.
  29. Devolder, A.; Van Braak, J.; Tondeur, J. Supporting self-regulated learning in computer-based learning environments: Systematic review of effects of scaffolding in the domain of science education. J. Comput. Assist. Learn. 2012, 28, 557–573.
  30. Phelps, E.A. Emotion and cognition: Insights from studies of the human amygdala. Annu. Rev. Psychol. 2006, 57, 27–53.
  31. Schutz, P.A.; DeCuir, J.T. Inquiry on emotions in education. Educ. Psychol. 2002, 37, 125–134.
  32. Pekrun, R.; Goetz, T.; Frenzel, A.C.; Barchfeld, P.; Perry, R.P. Measuring emotions in students’ learning and performance: The Achievement Emotions Questionnaire (AEQ). Contemp. Educ. Psychol. 2011, 36, 36–48.
  33. McAuley, E.; Duncan, T.; Tammen, V.V. Psychometric properties of the Intrinsic Motivation Inventory in a competitive sport setting: A confirmatory factor analysis. Res. Q. Exerc. Sport 1989, 60, 48–58.
  34. Fredricks, J.A.; Blumenfeld, P.C.; Paris, A.H. School engagement: Potential of the concept, state of the evidence. Rev. Educ. Res. 2004, 74, 59–109.
  35. Eisenberg, M.; Basman, A.; Hsi, S. Math on a sphere: Making use of public displays in mathematics and programming education. Knowl. Manag. E-Learn. 2014, 6, 140–155.
  36. Ralph, R.A. Post secondary project-based learning in science, technology, engineering and mathematics. J. Technol. Sci. Educ. 2016, 6, 26–35.
  37. Reis, A.C.B.; Barbalho, S.C.M.; Zanette, A.C.D. A bibliometric and classification study of Project-based Learning in Engineering Education. Production 2017, 27, e20162258.
  38. De los Ríos, I.; Cazorla, A.; Díaz-Puente, J.M.; Yagüe, J.L. Project–based learning in engineering higher education: Two decades of teaching competences in real environments. Procedia-Soc. Behav. Sci. 2010, 2, 1368–1378.
  39. Coronado, J.M.; Moyano, A.; Romero, V.; Ruiz, R.; Rodríguez, J. Student Long-Term Perception of Project-Based Learning in Civil Engineering Education: An 18-Year Ex-Post Assessment. Sustainability 2021, 13, 1949.
  40. Farid, T.; Ali, S.; Sajid, M.; Akhtar, K. Sustainability of Project-Based Learning by Incorporating Transdisciplinary Design in Fabrication of Hydraulic Robot Arm. Sustainability 2021, 13, 7949.
  41. Deek, F.P.; Hiltz, S.R.; Kimmel, H.; Rotter, N. Cognitive assessment of students’ problem solving and program development skills. J. Eng. Educ. 1999, 88, 317–326.
  42. Bayrak, F. Associations between university students’ online learning preferences, readiness, and satisfaction. Knowl. Manag. E-Learn. 2022, 14, 186–201.
  43. Weldon, A.; Ma, W.W.K.; Ho, I.M.K.; Li, E. Online learning during a global pandemic: Perceived benefits and issues in higher education. Knowl. Manag. E-Learn. 2021, 13, 161–181.
  44. Preacher, K.J. Advances in mediation analysis: A survey and synthesis of new developments. Annu. Rev. Psychol. 2015, 66, 825–852.
  45. Preacher, K.J.; Rucker, D.D.; MacCallum, R.C.; Nicewander, W.A. Use of the extreme groups approach: A critical reexamination and new recommendations. Psychol. Methods 2005, 10, 178–192.
  46. Slof, B.; Erkens, G.; Kirschner, P.A.; Janssen, J.; Jaspers, J.G.M. Successfully carrying out complex learning-tasks through guiding teams’ qualitative and quantitative reasoning. Instr. Sci. 2012, 40, 623–643.
  47. Wang, M.; Wu, B.; Kirschner, P.A.; Spector, J.M. Using cognitive mapping to foster deeper learning with complex problems in a computer-based environment. Comput. Hum. Behav. 2018, 87, 450–458.
  48. Sorva, J.; Karavirta, V.; Malmi, L. A review of generic program visualization systems for introductory programming education. ACM Trans. Comput. Educ. 2013, 13, 15.
  49. Corbalan, G.; Kester, L.; Van Merriënboer, J.J.G. Dynamic task selection: Effects of feedback and learner control on efficiency and motivation. Learn. Instr. 2009, 19, 455–465.
  50. Sung, H.-Y.; Hwang, G.-J. A collaborative game-based learning approach to improving students’ learning performance in science courses. Comput. Educ. 2013, 63, 43–51.
  51. Yuan, B.; Wang, M.; van Merriënboer, J.; Tao, X.; Kushniruk, A.; Peng, J. Investigating the Role of Cognitive Feedback in Practice-Oriented Learning for Clinical Diagnostics. Vocat. Learn. 2020, 13, 159–177.
  52. White, B.Y.; Frederiksen, J.R. Inquiry, modelling, and metacognition: Making science accessible to all students. Cogn. Instr. 1998, 16, 3–118.
  53. White, B.; Frederiksen, J. A Theoretical Framework and Approach for Fostering Metacognitive Development. Educ. Psychol. 2005, 40, 211–223.
  54. Zohar, A.; Dori, Y.J. Higher order thinking skills and low-achieving students: Are they mutually exclusive? J. Learn. Sci. 2003, 12, 145–181.
  55. Wu, B.; Wang, M.; Grotzer, T.A.; Liu, J.; Johnson, J.M. Visualizing Complex Processes Using a Cognitive-Mapping Tool to Support the Learning of Clinical Reasoning. BMC Med. Educ. 2016, 16, 216.
  56. Stewart, R.A. Investigating the link between self directed learning readiness and project-based learning outcomes: The case of international Masters students in an engineering management course. Eur. J. Eng. Educ. 2007, 32, 453–465.
Figure 1. Online learning environment.
Figure 2. Main activities in the PjBL course.
Figure 3. Students’ performance in subject knowledge tests.
Figure 4. Students’ performance in product quality.
Figure 5. Students’ performance in thinking skills: (a) problem understanding; (b) modular design; (c) process design.
Table 1. Inter-rater reliability.

| Measure | Cohen’s Kappa Coefficient | p |
|---|---|---|
| Pre-study knowledge tests | 0.820 | <0.001 |
| Post-study knowledge tests | 0.865 | <0.001 |
| Pre-study product quality | 0.835 | <0.001 |
| Post-study product quality | 0.889 | <0.001 |
| Pre-study thinking skills: problem understanding | 0.852 | <0.001 |
| Pre-study thinking skills: modular design | 0.852 | <0.001 |
| Pre-study thinking skills: process design | 0.900 | <0.001 |
| Post-study thinking skills: problem understanding | 0.871 | <0.001 |
| Post-study thinking skills: modular design | 0.894 | <0.001 |
| Post-study thinking skills: process design | 0.874 | <0.001 |
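For readers who want to reproduce the inter-rater reliability analysis in Table 1, the following is a minimal sketch of how Cohen’s kappa can be computed for two raters’ categorical codes. This is an illustration, not the authors’ analysis script; the function name `cohens_kappa` and the example data are our own.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters coding the same items.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed
    agreement and p_e the agreement expected by chance from the
    raters' marginal category frequencies.
    """
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of items both raters coded identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement under independent marginal distributions.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(counts_a[k] * counts_b[k] for k in counts_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)
```

Perfect agreement yields kappa = 1, while agreement no better than chance yields kappa near 0; the values in Table 1 (0.820–0.900) fall in the range conventionally interpreted as strong to almost perfect agreement.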
Table 2. Descriptive statistics of variables.

| Variable | Min | Max | Mean | SD |
|---|---|---|---|---|
| Pre-study knowledge | 10 | 94 | 40.20 | 19.45 |
| Post-study knowledge | 28 | 88 | 55.87 | 18.14 |
| Pre-study thinking skills: problem understanding | 5 | 20 | 13.20 | 4.05 |
| Pre-study thinking skills: modular design | 5 | 20 | 12.72 | 3.93 |
| Pre-study thinking skills: process design | 5 | 20 | 12.49 | 3.65 |
| Post-study thinking skills: problem understanding | 10 | 20 | 15.96 | 3.29 |
| Post-study thinking skills: modular design | 10 | 20 | 15.50 | 3.31 |
| Post-study thinking skills: process design | 10 | 20 | 15.70 | 3.25 |
| Pre-study product quality | 9 | 30 | 17.58 | 5.39 |
| Post-study product quality | 10 | 40 | 23.82 | 8.63 |
| Affective experiences: Interest/Enjoyment | 2.33 | 5 | 3.97 | 0.66 |
| Affective experiences: Perceived competence | 2.25 | 5 | 3.86 | 0.69 |
| Affective experiences: Effort/Importance | 2 | 5 | 3.95 | 0.72 |
| Affective experiences: Pressure/Tension | 1 | 5 | 2.84 | 1.12 |
| Affective experiences: Value/Usefulness | 3 | 5 | 4.22 | 0.46 |
Table 3. Paired sample t-tests on pre- and post-study knowledge and product quality.

| Variable | Pre-Study Mean (SD) | Post-Study Mean (SD) | t | p | d |
|---|---|---|---|---|---|
| Subject knowledge | 40.20 (19.45) | 55.87 (18.14) | −10.39 | <0.001 | −1.25 |
| Product quality | 17.58 (5.39) | 23.82 (8.63) | −6.20 | <0.001 | −0.75 |
Table 4. Paired sample t-tests on pre- and post-study thinking skills.

| Variable | Pre-Study Mean (SD) | Post-Study Mean (SD) | t | p | d |
|---|---|---|---|---|---|
| Problem understanding | 13.20 (4.05) | 15.96 (3.29) | −5.91 | <0.001 | −0.71 |
| Modular design | 12.72 (3.93) | 15.50 (3.31) | −6.44 | <0.001 | −0.78 |
| Process design | 12.49 (3.65) | 15.70 (3.25) | −7.75 | <0.001 | −0.93 |
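The statistics reported in Tables 3 and 4 can be reproduced from raw scores with a few lines of code. The sketch below computes the paired-samples t statistic and Cohen’s d for repeated measures; it is an illustration under standard formulas, not the authors’ analysis script, and the function name and example data are our own.

```python
import math
from statistics import mean, stdev

def paired_t_and_d(pre, post):
    """Paired-samples t statistic and Cohen's d for repeated measures.

    t = mean(diff) / (sd(diff) / sqrt(n)) and d = mean(diff) / sd(diff),
    where diff = pre - post. Negative values indicate post > pre,
    matching the sign convention in Tables 3 and 4.
    """
    diffs = [a - b for a, b in zip(pre, post)]
    n = len(diffs)
    m, s = mean(diffs), stdev(diffs)  # sample standard deviation (n - 1)
    t = m / (s / math.sqrt(n))
    d = m / s
    return t, d
```

For example, `paired_t_and_d([1, 2, 3, 4], [3, 3, 5, 6])` returns t = −7.0 and d = −3.5, reflecting a uniform improvement from pre to post.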
Table 5. Descriptive statistics of students’ affective experiences.

| Affective Experience | Mean | SD |
|---|---|---|
| Interest/Enjoyment | 3.97 | 0.66 |
| Perceived Competence | 3.86 | 0.69 |
| Effort/Importance | 3.95 | 0.72 |
| Value/Usefulness | 4.22 | 0.46 |
| Pressure/Tension | 2.84 | 1.12 |
Table 6. ANOVA of students’ pre- and post-study knowledge test scores.

| Subject Knowledge | Low-Achieving Group Mean (SD) | Medium-Achieving Group Mean (SD) | High-Achieving Group Mean (SD) | F | Scheffe’s Post Hoc |
|---|---|---|---|---|---|
| Pre-test | 18.40 (4.41) | 37.90 (8.01) | 65.74 (10.19) | 171.88 *** | High > Medium; High > Low; Medium > Low |
| Post-test | 43.43 (15.37) | 50.29 (12.12) | 77.21 (8.38) | 41.46 *** | High > Medium; High > Low |
| Gain score | 25.24 (13.23) | 12.39 (9.38) | 11.47 (11.72) | 9.61 *** | Low > Medium; Low > High |

*** p < 0.001.
Table 7. ANOVA of students’ pre- and post-study product quality.

| Product Quality | Low-Achieving Group Mean (SD) | Medium-Achieving Group Mean (SD) | High-Achieving Group Mean (SD) | F | Scheffe’s Post Hoc |
|---|---|---|---|---|---|
| Pre-study | 14.47 (3.26) | 17.06 (4.46) | 21.53 (6.24) | 10.82 *** | High > Medium; High > Low |
| Post-study | 23.74 (8.96) | 20.18 (7.67) | 29.84 (6.47) | 9.16 *** | High > Medium |
| Gain score | 9.26 (9.98) | 3.11 (7.16) | 8.31 (6.93) | 4.40 * | Low > Medium |

*** p < 0.001, * p < 0.05.
Table 8. ANOVA of students’ pre- and post-study thinking skills.

| Thinking Skills | Low-Achieving Group Mean (SD) | Medium-Achieving Group Mean (SD) | High-Achieving Group Mean (SD) | F | Scheffe’s Post Hoc |
|---|---|---|---|---|---|
| Pre-test: problem understanding | 10.84 (3.55) | 12.90 (3.69) | 16.03 (3.51) | 10.02 *** | High > Medium; High > Low |
| Pre-test: modular design | 10.21 (3.55) | 12.84 (3.56) | 15.03 (3.47) | 8.83 *** | High > Low; Medium > Low |
| Pre-test: process design | 10.05 (3.03) | 12.19 (2.95) | 15.40 (3.39) | 14.41 *** | High > Medium; High > Low |
| Post-test: problem understanding | 16.32 (3.28) | 14.63 (3.29) | 17.79 (2.35) | 6.46 ** | High > Medium |
| Post-test: modular design | 15.32 (3.43) | 14.28 (3.04) | 17.68 (2.58) | 7.44 ** | High > Medium |
| Post-test: process design | 15.97 (2.91) | 14.26 (3.17) | 17.79 (2.51) | 8.66 *** | High > Medium |
| Gain score: problem understanding | 5.47 (4.74) | 1.73 (2.70) | 1.76 (3.45) | 7.56 ** | Low > Medium; Low > High |
| Gain score: modular design | 5.11 (3.48) | 1.44 (3.08) | 2.66 (3.47) | 7.26 ** | Low > Medium |
| Gain score: process design | 5.92 (3.24) | 2.06 (2.77) | 2.39 (3.31) | 10.33 *** | Low > Medium; Low > High |

*** p < 0.001, ** p < 0.01.
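The F values reported in Tables 6–8 come from one-way ANOVAs comparing the three achievement groups. The following is a minimal sketch of the underlying computation; it illustrates the standard between/within sums-of-squares formula rather than the authors’ actual analysis pipeline, and the function name and example data are our own.

```python
from statistics import mean

def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA over k independent groups.

    F = MS_between / MS_within, where MS_between = SS_between / (k - 1)
    and MS_within = SS_within / (N - k).
    """
    all_vals = [x for g in groups for x in g]
    grand_mean = mean(all_vals)
    n_total, k = len(all_vals), len(groups)
    # Between-group sum of squares (df = k - 1): group means vs. grand mean.
    ss_between = sum(len(g) * (mean(g) - grand_mean) ** 2 for g in groups)
    # Within-group sum of squares (df = N - k): scores vs. their group mean.
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n_total - k))
```

For instance, `one_way_anova_f([[1, 2, 3], [4, 5, 6]])` returns 13.5. In practice a significant F is followed by a post hoc test (Scheffe’s test in this study) to identify which group pairs differ.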
Table 9. ANOVA of students’ affective experiences.

| Affective Experiences | Low-Achieving Group Mean (SD) | Medium-Achieving Group Mean (SD) | High-Achieving Group Mean (SD) | F |
|---|---|---|---|---|
| Interest/Enjoyment | 4.05 (0.59) | 3.92 (0.62) | 3.95 (0.81) | 0.22 |
| Perceived Competence | 4.09 (0.57) | 3.72 (0.74) | 3.86 (0.69) | 1.78 |
| Effort/Importance | 4.28 (0.69) | 3.83 (0.69) | 3.81 (0.72) | 3.01 |
| Value/Usefulness | 4.37 (0.45) | 4.19 (0.49) | 4.13 (0.40) | 1.39 |
| Pressure/Tension | 2.68 (1.14) | 3.11 (1.25) | 2.54 (0.80) | 1.76 |
Table 10. Student comments on the advantages of the PjBL course.

| Theme | Illustrative Example | Low-Achieving Group (n = 19) K (%) | Medium-Achieving Group (n = 31) K (%) | High-Achieving Group (n = 19) K (%) | Total (n = 69) K (%) |
|---|---|---|---|---|---|
| Self-regulated learning | The system allowed me to manage my learning process in an autonomous way. | 14 (74%) | 18 (58%) | 12 (63%) | 44 (64%) |
| Problem-solving ability | My programming problem-solving ability has been improved significantly. | 10 (53%) | 17 (55%) | 10 (53%) | 37 (54%) |
| Scaffolding for learning | The scaffolding provided in the system is very helpful. | 10 (53%) | 17 (55%) | 8 (42%) | 35 (51%) |
| Knowledge acquisition | The course helped me to gain a deeper and systematic understanding of programming knowledge. | 12 (63%) | 10 (32%) | 7 (37%) | 29 (42%) |
| Knowledge-practice integration | The course let me master programming knowledge and skills through a practical project. | 8 (42%) | 10 (32%) | 5 (26%) | 23 (33%) |
| Motivation for learning | My interest in learning computer programming has been greatly improved. | 7 (37%) | 9 (29%) | 6 (32%) | 22 (32%) |
| Learning resource | There are abundant multimedia learning resources in the system. | 7 (37%) | 7 (23%) | 4 (21%) | 18 (26%) |
| Reflective learning | The learning record in the system helped me to find out my problems. | 2 (11%) | 8 (26%) | 7 (37%) | 17 (25%) |

Note. n = number of all students. K = number of students giving responses under each theme. % = K/n.
Table 11. Student comments on the weaknesses of the PjBL course.

| Theme | Illustrative Example | Low-Achieving Group (n = 19) K (%) | Medium-Achieving Group (n = 31) K (%) | High-Achieving Group (n = 19) K (%) | Total (n = 69) K (%) |
|---|---|---|---|---|---|
| Technical problems | There are some bugs in the system. | 8 (42%) | 14 (45%) | 7 (37%) | 29 (42%) |
| Teacher-student interaction | The interaction with the teacher is insufficient during the learning program. | 7 (37%) | 7 (23%) | 3 (16%) | 17 (25%) |
| Learning difficulty | The project is difficult to me in some extent. | 3 (16%) | 8 (26%) | 4 (21%) | 15 (22%) |
| Learning time | I need more time to work on the programming project. | 4 (21%) | 6 (19%) | 2 (11%) | 12 (17%) |
| Supervision | Sometimes it is hard to focus on learning due to inadequate supervision by the teacher. | 3 (16%) | 4 (13%) | 2 (11%) | 9 (13%) |
| Boring | The learning environment is a bit boring. | 1 (5%) | 3 (10%) | 1 (5%) | 5 (7%) |

Note. n = number of all students. K = number of students giving responses under each theme. % = K/n.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Peng, J.; Yuan, B.; Sun, M.; Jiang, M.; Wang, M. Computer-Based Scaffolding for Sustainable Project-Based Learning: Impact on High- and Low-Achieving Students. Sustainability 2022, 14, 12907. https://doi.org/10.3390/su141912907
