Article

The Effect of Regular and Innovative Control Devices on Cultivating Creativity in a Game Creating Course in Primary School

Department of Computer Science, National Yang Ming Chiao Tung University, Hsinchu 300093, Taiwan
*
Author to whom correspondence should be addressed.
Educ. Sci. 2024, 14(8), 833; https://doi.org/10.3390/educsci14080833
Submission received: 20 May 2024 / Revised: 18 July 2024 / Accepted: 29 July 2024 / Published: 31 July 2024
(This article belongs to the Section Technology Enhanced Education)

Abstract

The development of creativity plays a decisive role in the future of human life, as it stimulates divergent thinking and grants the critical ability to innovate and solve problems. Therefore, the question of how to encourage students’ creativity has attracted the attention of research in various fields. Our study used the Scratch visual programming tool to allow students to create their own digital games, and we used different technological devices as external stimuli to provoke students’ creative ideas during the research process. We provided four control devices: a keyboard and mouse, a PicoBoard, a 65-inch touch screen, and a Wii remote, to 92 fifth-grade students in four classes grouped using S-shaped (serpentine) placement. After a 12-week experimental process, students designed their own original Scratch games. The results show that different device properties correspond to different dimensions of creativity: devices highly relevant to students’ life experiences improve the flexibility and elaboration of creativity; innovative controls promote originality; and the freedom of device control can increase fluency. Therefore, providing control devices with different properties allows teachers to establish learning environments that foster creativity. Finally, we speculate on the impact of other control devices on creativity based on the research results for future reference.

1. Introduction

1.1. Creativity

Creative thinking involves the ability to generate new ideas that can consciously provide multiple possible solutions to a given problem [1,2], allowing people to solve problems effectively and produce original and valuable products [3]. Therefore, creativity is crucial in various disciplines and pervades all areas of life, thereby changing how people live [4,5]. Accordingly, developing children’s creative learning skills is crucial to preparing them for future study and work and should be included in the curriculum from an early age [6,7,8]. Beyond being cultivated on its own, creativity is also often integrated into courses on mathematics, engineering, science, and technology [9], thereby guiding students to apply creativity in various fields. International organizations such as the OECD, World Economic Forum (WEF), UNESCO, and UNICEF have all regarded creativity as a core skill that promotes personal growth and lifelong learning. Although there are several definitions of creative thinking [3,10,11,12,13], this study describes these skills in terms of originality, fluency, elaboration, and flexibility [10].

1.2. Game-Based Learning

Lecturer-centered courses quickly bore students and leave them lacking motivation, creativity, and imagination because of limited media exposure and teaching methods; courses should therefore be conducted in scientifically grounded and creative ways [14,15,16], for example by applying higher-order thinking skills to storytelling [17], free-drawing training methods [18], or digital game-based learning [19]. The fun and reward mechanisms of games can strongly shape the knowledge learners acquire, prompting students to learn independently and actively [20,21,22]. The primary purpose of game-based learning is to allow learners to actively participate in their learning process [23]. High levels of curiosity and engagement enable students to focus more on their studies and become more confident as they solve more problems, significantly improving the motivation and performance of low-achieving students in the learning process [24,25,26].
In addition, more student-centered game-creation learning methods have been adopted in recent years, enabling learners to obtain a pleasant and rich learning experience [4,27]. Hands-on activities not only promote the application of practical technology but also allow students to cognitively construct their own knowledge structures [28,29]. The experience of transforming problem situations into meaningful solutions and tangible artifacts can help students address challenges that will change in the future [30,31,32,33,34]. Research has shown that digital platforms that promote programming can encourage and foster creativity [35,36]. To this end, researchers have used child-oriented programming environments to scaffold students’ creations [37]. When students program their own learning games, they need to find creative solutions that meet specific requirements [38], which develops students’ problem-solving skills [39,40,41,42] and serves as an essential method for planning projects and presenting ideas [43].

1.3. Scratch

However, the programming required for conventional game production is complicated for children, and many children cannot master programming syntax at all [44,45,46,47]. Therefore, block-based programming has attracted the attention of researchers and educators who study children’s cognitive and psychological development in constructivist environments [31,48,49]. Students can use visual, concrete building blocks to manipulate programs and quickly generate applications to gain practical experience [50,51]. Consequently, visual programming tools such as Alice 3 [52,53], Code.org [54,55], MIT App Inventor 2 [56,57], and Scratch 3.0 [43,45,58,59] are widely used in primary school courses. These programming platforms require little training, and people can develop their own programs, games, and animated environments without having to learn the complex code structures of traditional programming languages [45,59,60,61,62,63].
Among these programs, Scratch is the most popular [64]. One reason is that it provides an educational game environment [65] in the classroom, which helps students have fun while stimulating them to create interactive stories, games, and animations to visualize their ideas [66,67]. Making games, rather than merely playing serious games, can introduce students to programming, systems thinking, and new literacy skills, with better learning outcomes [27,42,68,69]. Many studies have shown that Scratch provides an efficient environment for teaching programming, making programming courses more interesting, stimulating, and easier to understand, and positively impacting creative thinking skills [70,71,72,73,74,75,76,77,78,79].

1.4. Control Devices

With the advancement of science and technology, various technological products have appeared in everyday life. These products enhance convenience, have become part of our lives, and have had many revolutionary impacts on the game industry. Some companies develop games for popular technology products, such as computers, mobile phones, and tablets, to attract players. Others have developed innovative devices such as the Wii, Kinect, and VR headsets to enhance players’ sense of motion during games.
In addition to games, some development boards, such as Arduino, support various sensors and provide more diverse interactions based on realistic physical feedback. Digital games are often thought of as being designed on a screen; however, game design with tangible physical operating interfaces can lead to a wealth of creative expression [34,80,81]. Another reason why Scratch is popular is its ability to connect to many hardware platforms, such as the PicoBoard sensor board [82,83,84], the MakeyMakey/Crazyer circuit board [85,86], the Arduino microcontroller, and the Kinect somatosensory device. These platforms can provide a variety of input and output interfaces, allowing students to create interactive media that combines virtual and physical reality [87,88,89]. A creative environment with a high degree of freedom can effectively affect students’ creativity levels [90].
To sum up, various devices can control Scratch for teaching activities. However, given the range of technological devices now available, how to choose among them to develop students’ creativity effectively is an important research topic. Some studies emphasize ways to enhance sensation and allow students to interact with physical feedback [34,80,81]; others argue that connecting with students’ experience can help creative thinking [91]; and the freedom afforded by equipment gives students more opportunities to apply divergent thinking for creativity development [90]. To this end, we categorized and selected devices with different properties and explored their impact on creativity.
First, consider the regular keyboard, mouse, and touchpad. As the most common technological devices today, they are the most suitable comparison objects. Second, the high degree of freedom of development boards such as Arduino means they are often used in maker spaces to design, produce, and learn how to turn ideas into reality, such as in courses on building creative robots. In comparison, however, PicoBoard and Makey Makey are friendlier to novices, and of the two, PicoBoard gives students a higher degree of freedom. PicoBoard can be used as a cheaper and simpler scientific tool that allows students to participate in the learning process through inquiry-based and practical activities [84]. PicoBoard can also make it easier for novice programmers to understand, stay motivated, and participate effectively [82,83]. Therefore, we chose PicoBoard as one of the operating interfaces for this study.
On the other hand, commercial games have already transcended the screen and entered the physical world [87]. New types of controllers include the Nintendo Wii remote, and many arcade machines interact with players physically through controllers, buttons, and screens. Developing creativity by connecting students’ senses and experiences with familiar problems helps them put new ideas into practice and then apply divergent thinking to solutions [91]. Therefore, we decided to draw on students’ gaming experience and also use the Wii remote to let students design games; working in a familiar situation should enhance students’ creativity development.

1.5. Creativity Evaluation

The measurement of creativity is one of the core issues in creativity research, mainly because the field lacks a unified concept of creativity [7,92]. Not only have the various tools and methods for assessing creativity been developed based on their developers’ perceptions of the construct [93], but evaluators’ subjective interpretations can also cause creativity assessment results to vary [94,95]. Therefore, an assessment tool that is objective and free from subjective influence is needed. Most Scratch-related assessment tools focus on students’ problem-solving skills from a programming perspective [41,42,96]. However, our research focuses on how the properties of different technological devices stimulate students’ thinking, so we use creative thinking as the basis for evaluation.
The most commonly used test to measure creativity is the Torrance Test of Creative Thinking [3], which is designed based on Guilford’s notion of divergent thinking and divides creative thinking into four items: fluency, flexibility, originality, and elaboration [97]. Fluency is the ability to generate many ideas or answers to a question quickly. Flexibility means being able to propose multiple approaches to a given problem simultaneously. Originality is the ability to come up with new and original concepts. Elaboration is the ability to classify, organize, and apply ideas in the mind. Scoring is done by comparing each respondent’s results with those of the other respondents in the sample [98,99,100,101,102].
Although it is difficult to find an assessment standard that fully captures the teaching of creativity [103], domain-related knowledge remains a cornerstone of creativity development [13,104,105,106]. This study uses game programming to stimulate students’ creativity, so a creativity measure grounded in the technology domain is reasonable and appropriate. In the end, we selected Professor Yeh’s Technological Creativity Test from Taiwan [107], based on its relevance to technology and to the experimental region. Professor Yeh sought to address the shortcomings of earlier divergent-thinking tests and compiled the instrument with reference to the scoring guidelines of domestic and foreign science and technology creativity competitions. The norming study involved 1839 students in grades three to six of elementary schools to establish percentile ranks and T-score norms. Test–retest reliability was established by retesting two weeks apart; the test–retest reliability of each ability indicator is between 0.47 and 0.65, the total test–retest reliability is 0.61, and the inter-rater reliability is between 0.955 and 1.000, indicating good test–retest and inter-rater reliability. The test also shows good criterion-related validity with respect to natural science achievement, natural science learning motivation, creative personal traits, creative school factors, and grade level.

1.6. Research Aim

This study aims to let students design Scratch games for different devices by providing different game control devices. By stimulating students’ relevant experiences with the technological devices they encounter in daily life, we explore how creativity is triggered and expressed during the learning process. Our research questions are as follows:
  • RQ1. What are the differences in the results of introducing different technological devices to stimulate students’ creativity?
  • RQ2. Which innovative or daily technological device can better enhance students’ creativity?

2. Materials and Methods

2.1. Participants and Procedure

This study adopted a quasi-experimental design. The same teacher, with eleven years of experience in information education, conducted the teaching experiment in four fifth-grade classes at Shen-Mei Elementary School in northern Keelung, Taiwan. Classes were randomly assigned to conditions, and all students had been grouped into classes in an S-shaped (serpentine) pattern based on academic performance in the previous grade. The S-shaped layout is a method for balancing ability across classes: the student with the highest score goes to the first class, the next to the second class, and so on until the last class, after which assignment reverses and runs back from the last class toward the first, the so-called S-type classification. Student ability is therefore similar between classes. We excluded 13 students who were absent from multiple lessons to ensure that we could adequately measure the impact of the course on students’ creativity. Our statistical analysis sample consisted of 92 students (Table 1). The students tested had basic information application abilities, such as typing, using writing software, simple drawing, and web browsing. The experimental process lasted 12 weeks after deducting national holidays and midterm and final exams.
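To make the S-type rule concrete, the short Python sketch below deals ranked students into classes in serpentine order. It is only an illustration of the assignment rule described above, using invented student data, not the school’s actual procedure or roster.

```python
# Illustrative sketch of S-type (serpentine) class assignment.
# Students are ranked by prior-grade performance and dealt into classes
# in the order 1, 2, 3, 4, 4, 3, 2, 1, 1, 2, ... so class means stay similar.
# Student data and class count below are hypothetical.

def s_type_assign(students, n_classes=4):
    """students: list of (name, score); returns {class_index: [names]}."""
    ranked = sorted(students, key=lambda s: s[1], reverse=True)
    classes = {c: [] for c in range(n_classes)}
    # Serpentine visiting order: 0,1,2,3,3,2,1,0, repeated.
    order = list(range(n_classes)) + list(range(n_classes - 1, -1, -1))
    for i, (name, _score) in enumerate(ranked):
        classes[order[i % len(order)]].append(name)
    return classes

if __name__ == "__main__":
    demo = [(f"S{i:02d}", score) for i, score in enumerate(range(100, 60, -1))]
    for c, members in s_type_assign(demo).items():
        print(f"Class {c + 1}: {members}")
```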
All the students were beginners in programming. In the first five weeks of the course, keyboard and mouse controls were used to teach basic programming. The teaching material used a frog-crossing-the-road example program in Scratch, through which the teacher led students through Scratch’s block-based interface and basic programming logic such as conditionals (if-else) and loops. After the introductory course, students took a pre-test of technological creativity. Conducting the pre-test at this point minimizes instructional confounds, because subsequent lessons differ between groups. Starting from the sixth week, the teacher provided the designated control interface to each group and taught students how to use the device to operate Scratch. In weeks seven to nine, students tried applying the control interface to Scratch games. Weeks 10 to 12 were left for students to refine their games. We expected students to be creative in using the features and potential of their control device in their games. Finally, after all the lessons, students took a post-test of technological creativity, which was compared with the pre-test to explore the differences in creativity between the interfaces used by different groups (Table 2). The following explains the control devices used by the different groups and the differences in game programming design:
  • Keyboard and mouse. These are the standard equipment in general computer classes. Students can design key bindings into the game to provide various operating functions according to the needs of the game, such as character movement, launching weapons, and performing special moves. For convenience, we refer to this group as the keyboard group below and use it as the experiment’s control group.
  • PicoBoard control board. The development board provides a light sensor, sound sensor, slider, button, and four alligator clips. Using these interfaces, students can design games based on environmental variables. It is similar to how some large-scale game consoles provide interactive physical control methods, allowing the game to have more interactive control possibilities.
  • Touch screen. The 65-inch large touch screen can provide touch operation like the mobile phones owned by most students. The large touch screen in the game design of this experiment allows students to connect with the interactive experience of tablet or mobile phone games in daily life.
  • Wii remote. Through the system’s underlying settings, we mapped the Wii remote’s buttons to keys on the keyboard so that it behaves like a game controller, without using the IR pointer or scrolling modes (a hypothetical mapping sketch follows this list). When students use Wii remotes, they can connect with past Wii gaming experiences and bring them into the experimental game designs.
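The paper does not name the mapping software used for this setup. As a rough illustration only, the sketch below assumes a Linux machine with a Bluetooth-paired Wii remote and the third-party cwiid and pynput Python libraries, and forwards a few buttons to arrow and letter keys so that Scratch sees ordinary key events; the actual classroom configuration may have used different tooling.

```python
# Hypothetical sketch: forwarding Wii remote buttons to keyboard key presses
# so Scratch receives ordinary key events (no IR pointer, no motion sensing).
# Assumes Linux, a Bluetooth-paired Wii remote, and the cwiid and pynput packages.
import time
import cwiid
from pynput.keyboard import Controller, Key

BUTTON_TO_KEY = {              # hypothetical mapping chosen for illustration
    cwiid.BTN_UP: Key.up,
    cwiid.BTN_DOWN: Key.down,
    cwiid.BTN_LEFT: Key.left,
    cwiid.BTN_RIGHT: Key.right,
    cwiid.BTN_A: 'a',
    cwiid.BTN_B: 'b',
    cwiid.BTN_1: Key.space,
}

def main():
    keyboard = Controller()
    print("Press 1+2 on the Wii remote to make it discoverable...")
    wiimote = cwiid.Wiimote()          # blocks until the remote connects
    wiimote.rpt_mode = cwiid.RPT_BTN   # report button state only
    pressed = set()
    while True:
        buttons = wiimote.state['buttons']
        for btn, key in BUTTON_TO_KEY.items():
            if buttons & btn and btn not in pressed:
                keyboard.press(key)    # button went down: press the mapped key
                pressed.add(btn)
            elif not (buttons & btn) and btn in pressed:
                keyboard.release(key)  # button released: release the mapped key
                pressed.discard(btn)
        time.sleep(0.01)

if __name__ == "__main__":
    main()
```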

2.2. Technological Creativity Test

This study used the Technological Creativity Test designed by Professor Yeh [108] to evaluate the scientific and technological creativity of 3rd to 6th-grade elementary school students. It quantifies students’ creative ability and helps teachers establish a learning environment suitable for creativity development. The test emphasizes the application of knowledge in the technological field and is therefore relevant to the topic of this study. The Technological Creativity Test provides two forms: word association and school bag design. Since this study uses visual programming tools for the teaching experiments, we judged the school bag design form to be more suitable. It lasts 30 min, during which students use their creativity and imagination to design and draw a school bag. Students can add various imaginative functions to the school bag, ranging from basic functions such as storing stationery and textbooks, to high-tech features such as anti-theft mechanisms, mechanical structures, or artificial intelligence, and even to sci-fi functions such as flight. In addition, students list the components required to implement each function. Students’ answers are scored against five evaluation indicators: fluency, flexibility, originality, elaboration, and visual presentation, and the weighted total score represents the student’s scientific and technological creativity. The calculation methods for the evaluation indicators are as follows:
  • Fluency. Count the number of valid answers. If an answer contains features of multiple categories, it is split into several valid answers and scored separately; conversely, similar features are combined into one valid answer. Fluency scoring tests whether students can come up with specific creative functions. Therefore, a general description such as a school bag that can be carried or lifted does not fall into any creative category and is not a valid answer. In addition, some students drew only the components without clearly writing the functions; if a component can be matched to a specific function, it still counts as a valid answer, such as a fire-breathing rocket that serves a flight function.
  • Flexibility. Count the number of different categories that the valid answers fall into. In the Technological Creativity Assessment Manual [108], the creative functions of school bags are divided into 23 types, as shown in Table 3 below. The manual uses the many sample answers provided by the 1839 norming students to build a reference answer table, so students’ answers can be evaluated objectively against the reference answers.
  • Originality. Score each answer from 0 to 2 points based on rarity. This indicator is also scored against the reference answer table in the assessment manual, which lists answers from common to rare with corresponding scores of 0 to 2 points. Rare ideas are considered more innovative and therefore score higher. If an answer does not appear in the reference answer table, it is regarded as a highly creative idea and awarded the highest score of 2 points.
  • Elaboration. Count the number of valid components. The same component is counted again when it appears in different creative functions, but duplicate components within the same creative function are not counted twice. Components are the items or materials needed to implement a creative function. For example, if a student draws a book, the imagined school bag has the function of storing textbooks; what is needed to realize this is a bag, generally made of fabric, closed with a metal zipper to keep the books from falling out, so these two components receive 2 points. The drawn books are the result of the storage function, so the book itself is not counted as a component of that function. In addition, simple graffiti or decorations on the school bag do not belong to any creative function and are not counted as components. Finally, even if students do not describe the required components in writing, any identifiable objects they draw are regarded as valid components.
  • Visual presentation. Count the total number of categories from Table 3 below represented by the items or equipment the student drew on the school bag.
Note that dimensions 1 to 4 are the most common and interpretable in creativity research, whereas visual presentation is rarely mentioned or discussed. Moreover, during the actual implementation of the experiment, this study found that flexibility and visual presentation both count the number of categories from Table 3 expressed in the school bag design: flexibility is based on what students describe in writing, while visual presentation is based on what they present in drawings. Because it is difficult for students to distinguish clearly between the two when answering, evaluation becomes harder. Therefore, we use the four indicators other than visual presentation, namely fluency, flexibility, originality, and elaboration, to evaluate technological creativity.
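To make the four scoring rules concrete, the sketch below tallies the indicators for a single hypothetical answer sheet that has already been coded against the manual’s reference table. The category names, rarity scores, and component counts are invented for illustration and do not reproduce the manual’s copyrighted reference answers.

```python
# Illustrative scoring of one hypothetical, pre-coded answer sheet.
# Each valid answer is coded as (category, rarity_score, n_components),
# following the rules described above; the data below are invented.
answers = [
    ("storage",    0, 2),   # fabric bag + zipper
    ("anti_theft", 1, 3),   # fingerprint lock, alarm, camera
    ("flight",     2, 2),   # rocket booster, wings
    ("flight",     2, 1),   # anti-gravity pad (same category, different function)
]

fluency     = len(answers)                                   # number of valid answers
flexibility = len({category for category, _, _ in answers})  # number of distinct categories
originality = sum(rarity for _, rarity, _ in answers)        # 0-2 points per answer by rarity
elaboration = sum(parts for _, _, parts in answers)          # valid components per function

print(f"Fluency={fluency}, Flexibility={flexibility}, "
      f"Originality={originality}, Elaboration={elaboration}")
```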

2.3. Data Analysis

We followed the instructions in the Technological Creativity Assessment Manual detailed in the previous section and scored students according to their creative performance. After tallying the scores, we used the fifth-grade score conversion table provided in the manual to first map students’ four creativity indicators onto the T-score norm (M = 50, SD = 10) and then convert them into percentile ranks for statistical analysis. We used SPSS 20.0 to perform all descriptive statistics, paired t-tests, and one-way ANCOVA analyses of changes in technological creativity between the groups of students learning with different tools. Note that ANCOVA requires homogeneity of regression and homogeneity of variance. The raw flexibility scores did not meet these assumptions, so we squared the raw scores for the ANCOVA analysis to satisfy the homogeneity assumptions.
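The manual supplies lookup tables for the norm conversion. As a rough illustration only (not the manual’s actual table), the sketch below shows the standard T-score formula, a percentile mapping via the normal CDF, and the squaring of raw flexibility scores mentioned above; the norm mean and SD are placeholders.

```python
# Illustrative conversion of raw indicator scores to T scores (M = 50, SD = 10)
# and percentile ranks. The study used the manual's fifth-grade norm table,
# so the norm parameters below are hypothetical placeholders.
from statistics import NormalDist

NORM_MEAN, NORM_SD = 12.0, 4.0   # hypothetical fifth-grade norm parameters

def t_score(raw):
    return 50 + 10 * (raw - NORM_MEAN) / NORM_SD

def percentile(raw):
    # Percentile rank of the T score under the T-score norm distribution.
    return 100 * NormalDist(mu=50, sigma=10).cdf(t_score(raw))

raw_flexibility = [5, 9, 14, 20]
print([round(t_score(x), 1) for x in raw_flexibility])
print([round(percentile(x), 1) for x in raw_flexibility])

# Squaring the raw flexibility scores before ANCOVA, as described above,
# to help meet the regression/variance homogeneity assumptions.
flexibility_sq = [x ** 2 for x in raw_flexibility]
print(flexibility_sq)
```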

3. Results

3.1. Paired t-Test

The pre-test scores of technological creativity were fluency (M = 41.45, SD = 29.37), flexibility (M = 26.25, SD = 26.42), originality (M = 46.09, SD = 32.56), and elaboration (M = 44.12, SD = 24.66). The post-test scores were fluency (M = 48.66, SD = 31.71), flexibility (M = 33.89, SD = 28.93), originality (M = 51.29, SD = 34.34), and elaboration (M = 52.72, SD = 28.73). Overall, the post-test total score (M = 46.86, SD = 30.70) was higher than the pre-test total score (M = 38.15, SD = 28.36) (Table 4).
To understand the changes in students’ technological creativity, we used paired t-tests to compare the pre-test and post-test. The results showed that fluency (t(91) = −2.265, p = 0.026, d = 0.236), flexibility (t(91) = −2.580, p = 0.011, d = 0.276), elaboration (t(91) = −2.724, p = 0.008, d = 0.321), and the total score (t(91) = −2.858, p = 0.005, d = 0.295) were significantly higher in the post-test than in the pre-test. According to these results, the teaching method of creating games in Scratch can significantly improve students’ technological creativity. However, there was no significant difference between pre-test and post-test scores for originality (t(91) = −1.301, p = 0.196, d = 0.155) (Table 4).
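A minimal sketch of how such paired comparisons could be reproduced outside SPSS (Python with SciPy) is shown below; the arrays are random placeholders standing in for the 92 students’ percentile scores, not the study data, and the effect size is the mean difference divided by the SD of the differences.

```python
# Minimal sketch of a paired t-test and Cohen's d for one creativity indicator.
# Placeholder data: the real analysis used the 92 students' scores in SPSS 20.0.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
pre = rng.normal(41, 29, size=92).clip(0, 100)           # e.g., fluency pre-test
post = (pre + rng.normal(7, 20, size=92)).clip(0, 100)   # e.g., fluency post-test

t, p = stats.ttest_rel(pre, post)          # paired t-test, df = n - 1 = 91
diff = post - pre
cohens_d = diff.mean() / diff.std(ddof=1)  # paired-samples effect size
print(f"t(91) = {t:.3f}, p = {p:.3f}, d = {cohens_d:.3f}")
```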
To explain the above results, we also used paired t-tests within each group to examine the improvement in each technological creativity ability and explore the impact of the different control devices in detail. The results show that the learning outcomes of the keyboard group mirror those of the whole sample: post-test scores were significantly higher than pre-test scores for fluency, flexibility, elaboration, and the total score, while originality showed no significant difference. The touch screen group’s results are also similar to those of the whole sample: post-test scores were significantly higher than pre-test scores for flexibility, elaboration, and the total score, but there was no significant difference for fluency or originality.
However, there was no significant difference between the post-test and the pre-test in the learning outcomes of the PicoBoard group and the Wii group in terms of various technological creativity abilities or total scores. It is worth noting that the post-test results of the Wii group showed a slight decrease in the average scores of all abilities, but there was no significant difference. This part is worth exploring in depth.

3.2. ANCOVA

Since the participants were students in the same grade assigned via the S-type classification, we can reasonably assume there was no significant difference in learning ability between classes, so comparing post-test scores alone should reveal differences in effectiveness between groups. The post-test scores of the PicoBoard group (M = 59.57, SD = 28.37) were higher than those of the keyboard group (M = 50.71, SD = 28.82), the touch screen group (M = 48.36, SD = 34.05), and the Wii group (M = 31.44, SD = 25.43) (Table 5). However, according to the paired t-test results in the previous section, the PicoBoard group did not show significant pre-to-post differences in any creative ability or in the total score. Moreover, the pre-test scores of the PicoBoard group also tended to be higher than those of the other groups (Table 4); students’ performance in academic subjects and their creative abilities are evidently not the same. Therefore, to understand the impact of the control device used by each group on creativity, we used ANCOVA to control for between-group differences before the course, performing a one-way ANCOVA on the post-test scores of the four groups with the pre-test scores as covariates.
One-way ANCOVA was used to test learning outcomes between the keyboard, PicoBoard, touch screen, and Wii groups. ANCOVA requires the homogeneity-of-regression and homogeneity-of-variance assumptions; note again that we squared the raw flexibility scores to satisfy the homogeneity assumptions, as described in Section 2.3.
The homogeneity-of-regression test showed no significant interaction between group and pre-test scores for any creativity item: fluency (F(3, 84) = 0.295, p = 0.829), flexibility (F(3, 84) = 0.664, p = 0.577), originality (F(3, 84) = 0.392, p = 0.759), elaboration (F(3, 84) = 1.668, p = 0.180), and total score (F(3, 84) = 0.031, p = 0.993). Therefore, this study could exclude the influence of the pre-test scores and test the differences in post-test creativity between the groups.
Moreover, Levene’s test showed no significant difference in the error variance of the dependent variables across the four groups, satisfying the homogeneity-of-variance assumption: fluency (F(3, 88) = 1.081, p = 0.361), flexibility (F(3, 88) = 1.655, p = 0.183), originality (F(3, 88) = 1.499, p = 0.220), elaboration (F(3, 88) = 2.304, p = 0.082), and total score (F(3, 88) = 1.768, p = 0.159). Therefore, ANCOVA can be applied to test the learning outcomes of the four groups with different control devices.
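The assumption checks and the one-way ANCOVA itself could be reproduced in Python with statsmodels and SciPy as sketched below (the study used SPSS 20.0); the data frame, column names, and group sizes are placeholders, and the Bonferroni step is indicated only in a comment.

```python
# Illustrative ANCOVA pipeline for one indicator (post-test score, pre-test covariate).
# Placeholder long-format data; the study ran the equivalent analyses in SPSS 20.0.
import numpy as np
import pandas as pd
from scipy import stats
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "group": np.repeat(["keyboard", "picoboard", "touch", "wii"], 23),
    "pre": rng.normal(40, 25, size=92).clip(0, 100),
})
df["post"] = (df["pre"] + rng.normal(8, 18, size=92)).clip(0, 100)

# 1) Homogeneity of regression slopes: the group x pre interaction should be non-significant.
slopes = smf.ols("post ~ C(group) * pre", data=df).fit()
print(sm.stats.anova_lm(slopes, typ=2).loc["C(group):pre"])

# 2) Homogeneity of variance: Levene's test across the four groups.
print(stats.levene(*[g["post"].values for _, g in df.groupby("group")]))

# 3) One-way ANCOVA: group effect on the post-test with the pre-test as covariate.
ancova = smf.ols("post ~ C(group) + pre", data=df).fit()
print(sm.stats.anova_lm(ancova, typ=2))
# Post hoc: pairwise contrasts on the adjusted means with Bonferroni correction
# (multiply each pairwise p-value by the number of comparisons, here 6).
```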
The ANCOVA results showed that the post-test scores for fluency (F(3, 87) = 4.179, p = 0.008, η2 = 0.126), flexibility (F(3, 87) = 4.346, p = 0.007, η2 = 0.130), elaboration (F(3, 87) = 4.416, p = 0.006, η2 = 0.132), and the total score (F(3, 87) = 2.963, p = 0.037, η2 = 0.093) differ significantly between the four groups. However, the post-test scores for originality (F(3, 87) = 2.358, p = 0.077, η2 = 0.075) showed no significant differences between the four groups (Table 6).
We then used Bonferroni post hoc analysis to compare the groups’ creative indicators based on the ANCOVA results. The fluency of the keyboard group (p = 0.019) and the PicoBoard group (p = 0.026) was significantly higher than that of the Wii group; the flexibility of the keyboard group (p = 0.004) was significantly higher than that of the Wii group; and the elaboration of the keyboard group (p = 0.011) and the touch screen group (p = 0.020) was significantly higher than that of the Wii group (Table 7). In addition, the touch screen group scored higher than the Wii group in flexibility, and the PicoBoard group scored higher than the Wii group in originality; although these differences are not statistically significant, we list them for subsequent discussion.
Based on the above, the ANCOVA results are consistent with the paired t-test results in the previous section, with significant differences in fluency, flexibility, elaboration, and the total score. These findings help explain the impact of the different control devices on creativity.

4. Discussion

This study aims to discover the impact of technological tools in daily life on the development of students’ creativity when used in Scratch game design. To explain the reasons for the results presented in the previous chapters, we decided to start with the characteristics of each device. In Section 1.4, we mentioned that if we can provide students with a creative environment with a high degree of freedom [90] or connect students’ senses and related experiences [91], it can effectively cultivate creativity. We used the phenomena observed from the experimental results to further classify the properties of the four control devices in the experiment. We proposed four items: freedom of control, device familiarity, experience gap, and physical sensing (Table 8) to discuss and explain the experimental results.
  • Freedom of control. This refers to how much variability the control device offers for students to exercise their imagination; a creative environment with a high degree of freedom can effectively affect the level of creativity students express [90]. Because the keyboard has relatively many keys, the keyboard and mouse give students enough diversity to control the game; designs need not even be limited to single-player control, since students can also design multi-player games. Therefore, the keyboard group gives students a high degree of freedom in game design. The PicoBoard control board provides a variety of sensors, allowing students to design games from different perspectives; however, compared with the keyboard, the number of commands available for controlling the game is still relatively small, so we classify PicoBoard’s freedom as medium. The remaining two devices are the touch screen, which only provides touch input on the screen, and the Wii remote, which has only seven buttons. These two offer fewer commands for controlling the game, limiting the variability students can achieve in game design, so we classify the touch screen and Wii remote as having low degrees of freedom.
  • Familiarity with equipment use. This refers to whether students are familiar enough with the device, or even use it regularly. If students are unfamiliar with the control device when designing a game, they must spend more effort on understanding how to operate it; only when they are familiar enough with the device do they have the capacity to add their own ideas. In this regard, the keyboard and mouse are the most basic input devices for computers, and students’ familiarity with them is the highest. Most families today have smartphones, which can even be considered daily necessities, so students also have a high degree of familiarity with touch screens. As for the Wii remote, only some families own one, but students still have a basic understanding of how it is operated, so we classify it as medium familiarity. PicoBoard is a tool tied to Scratch development that students are unlikely to have used before, so we classify it as low familiarity.
  • Experience gap. This refers to the gap between students’ expectations and their actual experience when designing or playing the games. The clearest case is the Wii remote. Since the Wii is a relatively popular game console, even students who have not played it have a certain degree of awareness of, and expectations about, its games. The Wii remote’s modifiability is limited, so we could only offer students its buttons as a game controller. In contrast, the sophistication and complexity of games on the Wii platform, combined with the remote’s IR and scrolling modes, make their operability much higher than that of the games designed through Scratch in this study. Therefore, the Wii group experiences a large gap between expectations and the actual gaming experience. On the other hand, students often control games on computers or mobile phones through keyboards, mice, or touch screens, which are almost the same devices as those used by the keyboard and touch screen groups; moreover, games designed through Scratch can provide experiences and feedback close to those of ordinary games, so the gap is small. As for PicoBoard, because it is a design tool tied to the Scratch platform, there are no existing games designed for it; this property is therefore not applicable to PicoBoard, and we classify it as none.
  • Physical sensing. This refers to whether the game’s control method involves physical sensors. For example, students can use environmental parameters such as light to switch the game scene between day and night, sound volume to control the intensity of actions, or body movements to control the avatar. Such methods help students put innovative ideas into practice [91]. Among the devices we selected, only PicoBoard uses physical sensing. As mentioned in Section 1.4, PicoBoard is a control board with moderate learning difficulty [82,83]; since Arduino can use more sensing devices by comparison, we classify PicoBoard’s sensing property as medium. We ranked the keyboard and touch screen, which have no sensors, as low, and, as noted above, the Wii remote can only provide low sensing because of the system settings.
Based on these four properties (Table 8), we integrated the paired t-test and ANCOVA results from the previous section. The paired t-tests showed significant differences in three creative indicators (fluency, flexibility, and elaboration) for the keyboard group and in two (flexibility and elaboration) for the touch screen group, while there were no significant differences for the PicoBoard and Wii groups (Table 4). We then used ANCOVA to analyze the scores of the four groups, which removes the influence of the groups’ different initial levels of creativity using the pre-test scores, and verified the ANCOVA results through Bonferroni post hoc comparisons (Table 7). The results showed statistically significant differences in fluency, flexibility, elaboration, and total scores. Each creative indicator is discussed further below.

4.1. Fluency

Fluency improved significantly only in the keyboard group (t(20) = −2.666, p = 0.015, d = 0.654). Fluency assesses whether students can come up with more creative ideas. The most prominent feature of the keyboard is its freedom of control (Table 8). If a device does not provide enough freedom, it restricts the space in which students can express their creative ideas; conversely, when the degree of freedom is higher, students have more opportunities to express their creativity. Although the PicoBoard group, with a medium degree of freedom, did not reach a statistically significant difference in fluency, the descriptive statistics (Table 4) show a clear increase from the pre-test (M = 54.43, SD = 33.28) to the post-test (M = 63.14, SD = 32.09). There was no significant improvement in fluency in the low-freedom touch screen group (t(24) = −1.705, p = 0.101, d = 0.358) or Wii group (t(24) = −1.305, p = 0.204, d = 0.263), owing to the restriction of students’ freedom of control. In addition, the ANCOVA results also showed statistically significant differences favoring the keyboard group (keyboard > Wii, p = 0.019) and the PicoBoard group (PicoBoard > Wii, p = 0.026) over the Wii group, supporting our inference.

4.2. Flexibility and Elaboration

Flexibility and elaboration improved significantly in both the keyboard group (flexibility: t(20) = −2.489, p = 0.022, d = 0.676; elaboration: t(20) = −3.971, p = 0.001, d = 0.816) and the touch screen group (flexibility: t(24) = −2.139, p = 0.043, d = 0.424; elaboration: t(24) = −3.086, p = 0.005, d = 0.614). Flexibility allows creative ideas to be applied in more fields, while elaboration concerns clearly expressing how ideas are realized and constructed. The keyboard and touch screen groups stand out in familiarity (Table 8). When students are familiar with a control device, they have a clearer idea of what it can and cannot do, and it becomes more concrete how to implement ideas and put the device to good use. For the PicoBoard, this was the students’ first contact with the device, so the low familiarity likely explains why neither flexibility (t(20) = −0.306, p = 0.762, d = 0.065) nor elaboration (t(20) = −0.470, p = 0.643, d = 0.131) improved significantly. In addition, according to the ANCOVA results, the flexibility score of the keyboard group is statistically significantly higher than that of the Wii group (keyboard > Wii, p = 0.004), and the touch screen group also tends to exceed the Wii group, although not significantly (touch screen > Wii, p = 0.083). Perhaps the difference arises because we did not use the mobile phones students commonly use but instead conducted the course with 65-inch large-screen devices. Nevertheless, the results remain consistent with our inference about improving students’ creativity.
As for the Wii remote, most students should have a certain degree of understanding of it and hence medium familiarity. Still, the results showed no change in flexibility (t(24) = 0.314, p = 0.757, d = 0.062) or elaboration (t(24) = 1.171, p = 0.253, d = 0.282); this is likely driven by the other property, the experience gap. Since the Wii remote could not enable IR mode or scrolling mode because of system design limitations, students could only control the game through the controller’s seven buttons. Their gaming experience in Scratch therefore differed greatly from commercial Nintendo games, reducing their interest in the Wii remote. The descriptive statistics (Table 4) show that this group’s average total score dropped slightly from the pre-test (M = 33.72, SD = 25.96) to the post-test (M = 31.44, SD = 25.43), and the same holds for the individual creative indicators. In contrast, the keyboard and touch screen groups experienced games close to what they expected, and the first-time PicoBoard group had no prior expectations to disappoint. The experience gap may therefore be why many learning results in the ANCOVA are significantly higher than those of the Wii group. This result shows that students’ expected gaming experience affects creativity, which deserves special attention when selecting control devices in subsequent research.

4.3. Originality

There was no significant difference in originality in any group. However, there is a considerable increase from the PicoBoard group’s pre-test (M = 59.90, SD = 36.23) to post-test (M = 69.95, SD = 29.94) scores. Although the difference is not significant (t(20) = −1.497, p = 0.150, d = 0.393), the PicoBoard appears to particularly stimulate students’ originality. This may be because the PicoBoard can connect with the physical world and support game designs based on different sound and light sensing methods, which gives students more novel ideas and improves their originality. In the ANCOVA results, although the PicoBoard group did not reach statistical significance (PicoBoard > Wii, p = 0.084), the pattern is still consistent with our inference about improving students’ creativity.

4.4. Other Technological Devices

The question remains whether there are better controls for cultivating creativity across all indicators. We speculate based on the above results. If a daily control device were enhanced with physical sensing, it should stimulate students’ originality. It may be challenging to add sensors to a keyboard and mouse. As for touch screens, today’s mobile phones contain many sensors, such as light, proximity, and gyroscope sensors, so mobile phones could introduce more functions into Scratch games beyond touch. In that case, it may be possible to provide more somatosensory, physical-sensing stimulation while retaining familiarity, thereby cultivating creativity more fully.
On the other hand, innovative controls have more room for improvement: they should retain their physical sensing while improving freedom of control and device familiarity. Increasing students’ familiarity may require improving the course content for learning the device, or extending course time so students spend more time with the new device. The freedom-of-control property, in turn, depends on the specific control device. Many control devices today use physical sensing. Based on the results of this study, we can speculate on the creative effects if the same curriculum and assessment methods were used with them (Table 9).
  • General game controller. It refers to a game controller with buttons and direction keys. It has a certain degree of familiarity for students so that it will improve their flexibility and elaboration. However, general game controllers have few buttons, so the fluency may not significantly improve. There is also no physical sensing, so it is challenging to enhance originality.
  • Wii (with IR and scroll mode). If the system limitations of the Wii remote could be resolved, it would provide students with good device familiarity and, with IR and scroll mode enabled, physical sensing. We therefore speculate that the complete Wii remote could achieve good results in flexibility, originality, and elaboration. However, its freedom of control remains limited, so there may not be a significant effect on fluency.
  • Kinect. It operates by physically sensing the student’s movements, which should produce good results in originality. However, freedom of control and device familiarity are low. Kinect may not achieve apparent fluency, flexibility, or elaboration results.
  • Smartphone (with more sensors). As mentioned earlier, if we add more mobile phone sensors connected to the Scratch game, we can achieve learning results in originality and retain flexibility and elaboration. However, the freedom of control will not be affected, so there should still be no apparent results in terms of fluency.
  • Arduino UNO. Arduino can connect to many sensors, so we expect it to increase students’ creativity even more than PicoBoard. A standard Arduino provides 20 pins for students to use freely, and this level of operability should be enough to affect fluency. However, Arduino is more challenging to learn, so familiarity would be lower, and it may be difficult to improve students’ flexibility and elaboration.

5. Conclusions

This article aims to understand how to choose control devices that stimulate more of students’ creativity in Scratch game design courses. Students used their creativity to design Scratch games and successfully improved their creativity [29,38] (Table 4). Compared with previous studies that focused solely on the problem-solving skills brought by programming in Scratch learning outcomes [41,42,96], our findings fill a missing piece of research on how game design enhances student creativity and contribute to research on cultivating student creativity. Below, we draw conclusions corresponding to the research questions raised in Section 1.6.
RQ1. What are the differences in the results of introducing different technological devices to stimulate students’ creativity? From the above discussion, we proposed four device properties: freedom of control, device familiarity, experience gap, and physical sensing, each of which has a specific impact on creativity. The results can be summarized as follows:
  • The control device’s freedom-of-control property positively impacts the fluency aptitude of creativity [90].
  • The control device’s familiarity property positively impacts students’ flexibility and elaboration aptitudes of creativity [91].
  • The control device’s physical sensing property positively impacts the originality aptitude of creativity [34,80,81].
  • If there is an experience gap between the control device and students’ expectations, it will impair students’ creativity.
RQ2. Which innovative or daily technological device can better enhance students’ creativity? Our study selected four control devices: the keyboard and touch screen are daily devices familiar to students, while the PicoBoard is an innovative device; although the Wii remote should be moderately familiar to students, we exclude it from this discussion because of its system design constraints. The results show that the highly familiar control devices had statistically significant effects on fluency, flexibility, and elaboration, while the innovative control device improved originality. In short, control devices with different properties promote different creative indicators, so devices with specific properties should be selected according to the direction of creativity one wants to cultivate.

6. Limitations and Further Research

Fifth-grade elementary school students in Taiwan were our experimental subjects. Considering differences in students’ programming abilities, we taught basic programming in the early stage of the course before conducting the pre-test in order to reduce such differences. Follow-up research should pay attention to differences in environment and in students’ thinking abilities when applying this experiment’s results to students of other regions or ages.
In addition, the descriptive statistics (Table 4) show that the average score of the class using the PicoBoard was much higher than that of the other groups, indicating that academic performance and creativity are not entirely equivalent. However, it is difficult to assign classes based on students’ creativity. Subsequent research should pay attention to this limitation and improve on it by referring to the results of this study.
The Wii remote used in this study was limited by the system design, so its learning results were weaker than they could have been. From this, we know that the experience gap has a significant impact on the cultivation of students’ creativity. Consequently, we recommend that when selecting experimental tools already known to students, researchers try to meet students’ expectations; otherwise, the cultivation of creativity will suffer.
There are surely more technological devices that could foster creativity, but the school curriculum schedule limited the scope of our study, and we could only conduct the curriculum experiment with four classes and four types of technological devices. Instead, we predict the creativity-cultivation effects that other, more common technological devices could provide based on our results (Table 9), for reference by future researchers in related fields.
In addition, most physical sensing devices offer limited freedom of control. Another approach is to combine two devices and use them simultaneously, i.e., not be limited to a single control device. For example, starting from a keyboard with a high degree of control freedom and pairing it with a device with physical sensing may improve the cultivation of creativity. However, to avoid the influence of mixed devices on the analysis results, this study did not combine them, and whether a keyboard and a sensing device can feasibly be operated simultaneously also needs to be considered. How to combine devices with different characteristics to foster creativity more holistically is a future challenge related to this study.

Author Contributions

Conceptualization, C.-Y.C. and S.-W.S.; methodology, C.-Y.C. and S.-W.S.; validation, C.-Y.C.; formal analysis, C.-Y.C.; investigation, S.-W.S.; resources, S.-W.S.; data curation, C.-Y.C.; writing—original draft preparation, C.-Y.C. and S.-W.S.; writing—review and editing, C.-Y.C.; visualization, C.-Y.C.; supervision, S.-M.Y. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data are available from the corresponding author upon reasonable request.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Shamay-Tsoory, S.G.; Adler, N.; Aharon-Peretz, J.; Perry, D.; Mayseless, N. The origins of originality: The neural bases of creative thinking and originality. Neuropsychologia 2011, 49, 178–185. [Google Scholar] [CrossRef] [PubMed]
  2. Hensley, N. Educating for sustainable development: Cultivating creativity through mindfulness. J. Clean. Prod. 2020, 243, 118542. [Google Scholar] [CrossRef]
  3. Torrance, E.P. Torrance Tests of Creative Thinking; Scholastic Testing Service: Bensenville, IL, USA, 1974. [Google Scholar]
  4. Navarrete, C.C. Creative thinking in digital game design and development: A case study. Comput. Educ. 2013, 69, 320–331. [Google Scholar] [CrossRef]
  5. Wöhler, J.; Reinhardt, R. The users’ perspective on how creativity techniques help in the idea generation process—A repertory grid study. Creat. Innov. Manag. 2021, 30, 144–163. [Google Scholar] [CrossRef]
  6. Beghetto, R.A. Creativity in the classroom. In The Cambridge Handbook of Creativity; Cambridge University Press: Cambridge, UK, 2010; pp. 447–463. [Google Scholar]
  7. Said-Metwaly, S.; Van den Noortgate, W.; Kyndt, E. Methodological issues in measuring creativity: A systematic literature review. Creat. Theor.-Res.-Appl. 2017, 4, 276–301. [Google Scholar] [CrossRef]
  8. Vygotsky, L.S. Imagination and creativity in childhood. J. Russ. East Eur. Psychol. 2004, 42, 7–97. [Google Scholar] [CrossRef]
  9. Lucas, B.; Venckute, M. Creativity—A transversal skill for lifelong learning. An overview of existing concepts and practices: Literature review report. JRC Work. Pap; Publications Office of the European Union: Luxembourg, 2020. [Google Scholar]
  10. Guilford, J.P. Creativity. Am. Psychol. 1950, 5, 444–454. [Google Scholar] [CrossRef] [PubMed]
  11. Ausubel, D.P. The Psychology of Meaningful Verbal Learning; Grune & Stratton: New York, NY, USA, 1963. [Google Scholar]
  12. De Bono, E. The CoRT thinking program. In Thinking and Learning Skills; Routledge: London, UK, 2014; pp. 363–388. [Google Scholar]
  13. Csikszentmihalyi, M. Toward a psychology of optimal experience. In Flow and the Foundations of Positive Psychology: The Collected Works of Mihaly Csikszentmihalyi; Springer: Berlin/Heidelberg, Germany, 2014; pp. 209–226. [Google Scholar]
  14. Satria, E.; Widodo, A. View of teachers and students understanding’of the nature of science at elementary schools in Padang city Indonesia. Proc. J. Phys. Conf. Ser. 2020, 1567, 032066. [Google Scholar] [CrossRef]
  15. Satria, E.; Sopandi, W. Applying RADEC model in science learning to promoting students’ critical thinking in elementary school. Proc. J. Phys. Conf. Ser. 2019, 1321, 032102. [Google Scholar] [CrossRef]
  16. Satria, E. Projects for the implementation of science technology society approach in basic concept of natural science course as application of optical and electrical instruments’ material. Proc. J. Phys. Conf. Ser. 2018, 983, 012049. [Google Scholar] [CrossRef]
  17. Aisyah, S.; Setiawan, D. The use of high order thinking skill in story telling method in order to improve children’critical thinking. J. Engl. Educ. 2009, 3, 15–26. [Google Scholar] [CrossRef]
  18. Laili, S.I.; Yuniarti, E.V. Influence of free drawings to improve creativity in 5th grader children in mi mu’awanah al-hasyimiyah. Int. J. Nurs. Midwifery Sci. 2017, 1, 83–88. [Google Scholar]
  19. Behnamnia, N.; Kamsin, A.; Ismail, M.A.B. The landscape of research on the use of digital game-based learning apps to nurture creativity among young children: A review. Think. Ski. Creat. 2020, 37, 100666. [Google Scholar] [CrossRef]
  20. Giannakos, M.N. Enjoy and learn with educational games: Examining factors affecting learning performance. Comput. Educ. 2013, 68, 429–439. [Google Scholar] [CrossRef]
  21. Kalogiannakis, M.; Ampartzaki, M.; Papadakis, S.; Skaraki, E. Teaching natural science concepts to young children with mobile devices and hands-on activities. A case study. Int. J. Teach. Case Stud. 2018, 9, 171–183. [Google Scholar] [CrossRef]
  22. Wells, P.; De Lange, P.; Fieger, P. Integrating a virtual learning environment into a second-year accounting course: Determinants of overall student perception. Account. Financ. 2008, 48, 503–518. [Google Scholar] [CrossRef]
  23. Prentice, R. Creativity: A reaffirmation of its place in early childhood education. Curric. J. 2000, 11, 145–158. [Google Scholar] [CrossRef]
  24. Jarrah, A.M.; Almassri, H.; Johnson, J.D.; Wardat, Y. Assessing the impact of digital games-based learning on students’ performance in learning fractions using (ABACUS) software application. EURASIA J. Math. Sci. Technol. Educ. 2022, 18, em2159. [Google Scholar]
  25. Dele-Ajayi, O.; Strachan, R.; Pickard, A.J.; Sanderson, J.J. Games for teaching mathematics in Nigeria: What happens to pupils’ engagement and traditional classroom dynamics? IEEE Access 2019, 7, 53248–53261. [Google Scholar] [CrossRef]
  26. Huang, Y.-M.; Huang, Y.-M. A scaffolding strategy to develop handheld sensor-based vocabulary games for improving students’ learning motivation and performance. Educ. Technol. Res. Dev. 2015, 63, 691–708. [Google Scholar] [CrossRef]
  27. Ding, A.-C.E.; Yu, C.-H. Serious game-based learning and learning by making games: Types of game-based pedagogies and student gaming hours impact students’ science learning outcomes. Comput. Educ. 2024, 218, 105075. [Google Scholar] [CrossRef]
  28. Piaget, J. To Understand is to Invent: The Future of Education; Grossman Publishers: New York, NY, USA, 1973. [Google Scholar]
  29. Ke, F.M.; Clark, K.; Uysal, S. Architecture game-based mathematical learning by making. Int. J. Sci. Math. Educ. 2019, 17, 167–184. [Google Scholar] [CrossRef]
  30. Tsai, H.-Y.; Chung, C.-C.; Lou, S.-J. Construction and development of iSTEM learning model. Eurasia J. Math. Sci. Technol. Educ. 2017, 14, 15–32. [Google Scholar]
  31. Ng, O.-L.; Cui, Z. Examining primary students’ mathematical problem-solving in a programming context: Towards computationally enhanced mathematics education. ZDM–Math. Educ. 2021, 53, 847–860. [Google Scholar] [CrossRef]
  32. Ng, O.-L.; Ferrara, F. Towards a materialist vision of ‘learning as making’: The case of 3D printing pens in school mathematics. Int. J. Sci. Math. Educ. 2020, 18, 925–944. [Google Scholar] [CrossRef]
  33. Ng, O.-L.; Liu, M.; Cui, Z. Students’ in-moment challenges and developing maker perspectives during problem-based digital making. J. Res. Technol. Educ. 2023, 55, 411–425. [Google Scholar] [CrossRef]
  34. Papert, S.A. Mindstorms: Children, Computers, and Powerful Ideas; Ingram International Inc.: La Vergne, TN, USA, 2020. [Google Scholar]
  35. Resnick, M.; Myers, B.; Nakakoji, K.; Shneiderman, B.; Pausch, R.; Selker, T.; Eisenberg, M. Design principles for tools to support creative thinking. In The Cambridge Handbook of Creativity; Cambridge University Press: Cambridge, UK, 2005. [Google Scholar]
  36. Romero, M.; Lepage, A.; Lille, B. Computational thinking development through creative programming in higher education. Int. J. Educ. Technol. High. Educ. 2017, 14, 42. [Google Scholar] [CrossRef]
  37. Cutumisu, M.; Adams, C.; Lu, C. A scoping review of empirical research on recent computational thinking assessments. J. Sci. Educ. Technol. 2019, 28, 651–676. [Google Scholar] [CrossRef]
  38. Schaumont, P.; Verbauwhede, I. The exponential impact of creativity in computer engineering education. In Proceedings of the 2013 IEEE International Conference on Microelectronic Systems Education (MSE), Austin, TX, USA, 2–3 June 2013; pp. 17–20. [Google Scholar]
  39. Mangaroska, K.; Sharma, K.; Gašević, D.; Giannakos, M. Exploring students’ cognitive and affective states during problem solving through multimodal data: Lessons learned from a programming activity. J. Comput. Assist. Learn. 2022, 38, 40–59. [Google Scholar] [CrossRef]
  40. Zuckerman, O.; Blau, I.; Monroy-Hernández, A. Children’s participation patterns in online communities. Interdiscip. J. E-Learn. Learn. Objects 2009, 5, 263–274. [Google Scholar] [CrossRef]
  41. Chen, Y.; Zhao, Y.; Wang, M. An Empirical Study on the Effect of Gamified Teaching in Scratch Courses on Developing Elementary Students’ Computational Thinking. In Proceedings of the 2024 13th International Conference on Educational and Information Technology (ICEIT), Chengdu, China, 22–24 March 2024; pp. 78–83. [Google Scholar]
  42. Wanglang, C.; Sraubon, K.; Piriyasurawong, P. Combining Game-Based Learning with Design Thinking Using Block-Based Programming to Enhance Computational Thinking and Creative Game for Primary Students. High. Educ. Stud. 2024, 14, 137–147. [Google Scholar] [CrossRef]
  43. Brennan, K.; Resnick, M. New frameworks for studying and assessing the development of computational thinking. In Proceedings of the 2012 Annual Meeting of the American Educational Research Association, Vancouver, BC, Canada, 13–17 April 2012; p. 25. [Google Scholar]
  44. Chou, C.-H.; Su, Y.-S.; Chen, H.-J. Interactive teaching aids integrating building blocks and programming logic. J. Internet Technol. 2019, 20, 1709–1720. [Google Scholar]
  45. Resnick, M.; Maloney, J.; Monroy-Hernández, A.; Rusk, N.; Eastmond, E.; Brennan, K.; Millner, A.; Rosenbaum, E.; Silver, J.; Silverman, B. Scratch: Programming for all. Commun. ACM 2009, 52, 60–67. [Google Scholar] [CrossRef]
  46. Su, A.Y.; Huang, C.S.; Yang, S.J.; Ding, T.-J.; Hsieh, Y. Effects of annotations and homework on learning achievement: An empirical study of Scratch programming pedagogy. J. Educ. Technol. Soc. 2015, 18, 331–343. [Google Scholar]
  47. Wu, S.-Y.; Su, Y.-S. Visual programming environments and computational thinking performance of fifth-and sixth-grade students. J. Educ. Comput. Res. 2021, 59, 1075–1092. [Google Scholar] [CrossRef]
  48. Cui, Z.; Ng, O.-L. The interplay between mathematical and computational thinking in primary school students’ mathematical problem-solving within a programming environment. J. Educ. Comput. Res. 2021, 59, 988–1012. [Google Scholar] [CrossRef]
  49. Richard, G.T.; Giri, S. Digital and physical fabrication as multimodal learning: Understanding youth computational thinking when making integrated systems through bidirectionally responsive design. ACM Trans. Comput. Educ. (TOCE) 2019, 19, 1–35. [Google Scholar] [CrossRef]
  50. Shu, N.C. Visual programming languages: A perspective and a dimensional analysis. In Proceedings of the Visual Languages; Springer: Boston, MA, USA, 1986; pp. 11–34. [Google Scholar]
  51. Eid, C.; Millham, R. Which introductory programming approach is most suitable for students: Procedural or visual programming? Am. J. Bus. Educ. (AJBE) 2012, 5, 173–178. [Google Scholar] [CrossRef]
  52. Moskal, B.; Lurie, D.; Cooper, S. Evaluating the effectiveness of a new instructional approach. In Proceedings of the 35th SIGCSE technical symposium on Computer science education, Norfolk, VA, USA, 3–7 March 2004; pp. 75–79. [Google Scholar]
  53. Dann, W.; Cosgrove, D.; Slater, D.; Culyba, D.; Cooper, S. Mediated transfer: Alice 3 to java. In Proceedings of the 43rd ACM Technical Symposium on Computer Science Education, Raleigh, NC, USA, 29 February–3 March 2012; pp. 141–146. [Google Scholar]
  54. Kalelioğlu, F. A new way of teaching programming skills to K-12 students: Code. org. Comput. Hum. Behav. 2015, 52, 200–210. [Google Scholar] [CrossRef]
  55. Du, J.; Wimmer, H.; Rada, R. "Hour of Code": Can It Change Students' Attitudes Toward Programming? J. Inf. Technol. Educ. Innov. Pract. 2016, 15, 53. [Google Scholar] [CrossRef]
  56. Pokress, S.C.; Veiga, J.J.D. MIT App Inventor: Enabling personal mobile computing. arXiv 2013, arXiv:1310.2830. [Google Scholar]
  57. Patton, E.W.; Tissenbaum, M.; Harunani, F. MIT app inventor: Objectives, design, and development. In Computational Thinking Education; Springer: Singapore, 2019; pp. 31–49. [Google Scholar]
  58. Brennan, K.; Resnick, M. Stories from the Scratch community: Connecting with ideas, interests, and people. In Proceedings of the 44th ACM Technical Symposium on Computer Science Education, Denver, CO, USA, 6–9 March 2013; pp. 463–464. [Google Scholar]
  59. Maloney, J.H.; Peppler, K.; Kafai, Y.; Resnick, M.; Rusk, N. Programming by choice: Urban youth learning programming with scratch. In Proceedings of the 39th SIGCSE Technical Symposium on Computer Science Education, Portland, OR, USA, 12–15 March 2008; pp. 367–371. [Google Scholar]
  60. Halbert, D.C. Programming by Example. Ph.D. Thesis, University of California, Berkeley, CA, USA, 1984. [Google Scholar]
  61. Weng, X.; Ng, O.-L.; Cui, Z.; Leung, S. Creativity development with problem-based digital making and block-based programming for science, technology, engineering, arts, and mathematics learning in middle school contexts. J. Educ. Comput. Res. 2023, 61, 304–328. [Google Scholar] [CrossRef]
  62. AbdulSamad, U.; Romli, R. A Comparison of Block-Based Programming Platforms for Learning Programming and Creating Simple Application. In Proceedings of the International Conference of Reliable Information and Communication Technology, Online, 22–23 December 2021; pp. 630–640. [Google Scholar]
  63. Sivilotti, P.A.; Laugel, S.A. Scratching the surface of advanced topics in software engineering: A workshop module for middle school students. ACM SIGCSE Bull. 2008, 40, 291–295. [Google Scholar] [CrossRef]
  64. Coronado, E.; Mastrogiovanni, F.; Indurkhya, B.; Venture, G. Visual programming environments for end-user development of intelligent and social robots, a systematic review. J. Comput. Lang. 2020, 58, 100970. [Google Scholar] [CrossRef]
  65. Alakoç, Z. Technological modern teaching approaches in mathematics teaching. Turk. Online J. Educ. Technol. 2003, 2, 1303–6521. [Google Scholar]
  66. Gonzalez, C. Student Usability in Educational Software and Games: Improving Experiences: Improving Experiences; IGI Global: Hershey, PA, USA, 2012. [Google Scholar]
  67. Rodríguez-Martínez, J.A.; González-Calero, J.A.; Sáez-López, J.M. Computational thinking and mathematics using Scratch: An experiment with sixth-grade students. Interact. Learn. Environ. 2020, 28, 316–327. [Google Scholar] [CrossRef]
  68. Kafai, Y.B. Minds in Play: Computer Game Design as a Context for Children's Learning; Routledge: London, UK, 1995. [Google Scholar]
  69. Hayes, E.R.; Games, I.A. Making computer games and design thinking: A review of current software and strategies. Games Cult. 2008, 3, 309–332. [Google Scholar] [CrossRef]
  70. Kalelioglu, F.; Gülbahar, Y. The Effects of Teaching Programming via Scratch on Problem Solving Skills: A Discussion from Learners’ Perspective. Inform. Educ. 2014, 13, 33–50. [Google Scholar] [CrossRef]
  71. Jiang, B.; Li, Z. Effect of Scratch on computational thinking skills of Chinese primary school students. J. Comput. Educ. 2021, 8, 505–525. [Google Scholar] [CrossRef]
  72. Sarıtepeci, M.; Durak, H. Analyzing the effect of block and robotic coding activities on computational thinking in programming education. Educ. Res. Pract. 2017, 490, 501. [Google Scholar]
  73. Yildiz Durak, H. The effects of using different tools in programming teaching of secondary school students on engagement, computational thinking and reflective thinking skills for problem solving. Technol. Knowl. Learn. 2020, 25, 179–195. [Google Scholar] [CrossRef]
  74. Çatlak, Ş.; Tekdal, M.; Baz, F.Ç. Scratch yazılımı ile programlama öğretiminin durumu: Bir doküman inceleme çalışması [The status of teaching programming with Scratch software: A document review study]. J. Instr. Technol. Teach. Educ. 2015, 4, 13–25. [Google Scholar]
  75. Kobsiripat, W. Effects of the media to promote the scratch programming capabilities creativity of elementary school students. Procedia-Soc. Behav. Sci. 2015, 174, 227–232. [Google Scholar] [CrossRef]
  76. Flanagan, S. Introduce programming in a fun, creative way. Tech Dir. 2015, 74, 18. [Google Scholar]
  77. Kim, H.; Choi, H.; Han, J.; So, H.-J. Enhancing teachers’ ICT capacity for the 21st century learning environment: Three cases of teacher education in Korea. Australas. J. Educ. Technol. 2012, 28, 965–982. [Google Scholar] [CrossRef]
  78. Garneli, V.; Giannakos, M.N.; Chorianopoulos, K.; Jaccheri, L. Serious game development as a creative learning experience: Lessons learnt. In Proceedings of the 2015 IEEE/ACM 4th International Workshop on Games and Software Engineering, Online, 18 May 2015; pp. 36–42. [Google Scholar]
  79. Pacheco, M.; Fogh, R.; Lund, H.H.; Christensen, D.J. Fable II: Design of a modular robot for creative learning. In Proceedings of the 2015 IEEE International Conference on Robotics and Automation (ICRA), Seattle, WA, USA, 26–30 May 2015; pp. 6134–6139. [Google Scholar]
  80. Buechley, L.; Eisenberg, M. The LilyPad Arduino: Toward wearable engineering for everyone. IEEE Pervasive Comput. 2008, 7, 12–15. [Google Scholar] [CrossRef]
  81. Kafai, Y.B.; Vasudevan, V. Constructionist gaming beyond the screen: Middle school students' crafting and computing of touchpads, board games, and controllers. In Proceedings of the Workshop in Primary and Secondary Computing Education, London, UK, 9–11 November 2015; pp. 49–54. [Google Scholar]
  82. DuMont, M. Empowerment through design: Engaging alternative high school students through the design, development and crafting of digitally-enhanced pets. In Proceedings of the 11th International Conference on Interaction Design and Children, Bremen, Germany, 12–15 June 2012; pp. 343–346. [Google Scholar]
  83. DuMont, M.; Lee, V.R. Material pets, virtual spaces, isolated designers: How collaboration may be unintentionally constrained in the design of tangible computational crafts. In Proceedings of the 11th International Conference on Interaction Design and Children, Bremen, Germany, 12–15 June 2012; pp. 244–247. [Google Scholar]
  84. Pierratos, T.; Koltsakis, E.; Polatoglou, H.M. Teaching Physics: Utilization of Scratchboard in Laboratories’ Activities. Proc. AIP Conf. Proc. 2010, 1203, 1442–1446. [Google Scholar]
  85. Silver, J.; Rosenbaum, E.; Shaw, D. Makey Makey: Improvising tangible and nature-based user interfaces. In Proceedings of the Sixth International Conference on Tangible, Embedded and Embodied Interaction, Kingston, ON, Canada, 19–22 February 2012; pp. 367–370. [Google Scholar]
  86. Kafai, Y.B.; Burke, Q. Connected Code: Why Children Need to Learn Programming; MIT Press: Cambridge, MA, USA, 2014. [Google Scholar]
  87. Eisenberg, M.; Elumeze, N.; MacFerrin, M.; Buechley, L. Children’s programming, reconsidered: Settings, stuff, and surfaces. In Proceedings of the 8th International Conference on Interaction Design and Children, Como, Italy, 3–5 June 2009; pp. 1–8. [Google Scholar]
  88. Golsteijn, C.; Van Den Hoven, E.; Frohlich, D.; Sellen, A. Hybrid crafting: Towards an integrated practice of crafting with physical and digital components. Pers. Ubiquitous Comput. 2014, 18, 593–611. [Google Scholar] [CrossRef]
  89. Horn, M.S.; Crouser, R.J.; Bers, M.U. Tangible interaction and learning: The case for a hybrid approach. Pers. Ubiquitous Comput. 2012, 16, 379–389. [Google Scholar] [CrossRef]
  90. Wong, G.K.-W.; Cheung, H.-Y. Exploring children’s perceptions of developing twenty-first century skills through computational thinking and programming. Interact. Learn. Environ. 2020, 28, 438–450. [Google Scholar] [CrossRef]
  91. Carbonell-Carrera, C.; Saorin, J.L.; Melian-Diaz, D.; De la Torre-Cantero, J. Enhancing creative thinking in STEM with 3D CAD modelling. Sustainability 2019, 11, 6036. [Google Scholar] [CrossRef]
  92. Plucker, J.A.; Makel, M.C.; Qian, M. Assessment of creativity. In The Cambridge Handbook of Creativity; Cambridge University Press: Cambridge, UK, 2010; pp. 48–73. [Google Scholar]
  93. Treffinger, D.J.; Renzulli, J.S.; Feldhusen, J.F. Problems in the assessment of creative thinking. J. Creat. Behav. 1971, 5, 104–112. [Google Scholar] [CrossRef]
  94. Wang, H.-C.; Chang, C.-Y.; Li, T.-Y. Assessing creative problem-solving with automated text grading. Comput. Educ. 2008, 51, 1450–1466. [Google Scholar] [CrossRef]
  95. Cropley, D.H.; Kaufman, J.C. Measuring functional creativity: Non-expert raters and the Creative Solution Diagnosis Scale. J. Creat. Behav. 2012, 46, 119–137. [Google Scholar] [CrossRef]
  96. Su, S.-W.; Chen, L.-X.; Yuan, S.-M.; Sun, C.-T. Cultivating Creativity and Improving Coding Skills in Primary School Students via Domain-General and Domain-Specific Learning Scaffoldings. Educ. Sci. 2024, 14, 695. [Google Scholar] [CrossRef]
  97. Guilford, J.P. Fundamental Statistics in Psychology and Education; Cambridge University Press: Cambridge, UK, 1950. [Google Scholar]
  98. Krumm, G.; Filippetti, V.A.; Lemos, V.; Koval, J.; Balabanian, C. Construct validity and factorial invariance across sex of the Torrance Test of Creative Thinking–Figural Form A in Spanish-speaking children. Think. Ski. Creat. 2016, 22, 180–189. [Google Scholar] [CrossRef]
  99. Hahm, J.; Kim, K.K.; Park, S.-H. Cortical correlates of creative thinking assessed by the figural Torrance Test of Creative Thinking. NeuroReport 2019, 30, 1289. [Google Scholar] [CrossRef]
  100. Said-Metwaly, S.; Fernández-Castilla, B.; Kyndt, E.; Van den Noortgate, W. The factor structure of the Figural Torrance Tests of Creative Thinking: A meta-confirmatory factor analysis. Creat. Res. J. 2018, 30, 352–360. [Google Scholar]
  101. Kim, K.H. The torrance tests of creative thinking-figural or verbal: Which one should we use? Creat. Theor.-Res.-Appl. 2017, 4, 302–321. [Google Scholar] [CrossRef]
  102. Humble, S.; Dixon, P.; Mpofu, E. Factor structure of the Torrance Tests of Creative Thinking Figural Form A in Kiswahili speaking children: Multidimensionality and influences on creative behavior. Think. Ski. Creat. 2018, 27, 33–44. [Google Scholar] [CrossRef]
  103. Hennessey, B.A.; Amabile, T.M. Story-telling: A method for assessing children’s creativity. J. Creat. Behav. 1988, 22, 235–246. [Google Scholar] [CrossRef]
  104. Amabile, T.M. Creativity in Context: Update to the Social Psychology of Creativity; Routledge: London, UK, 2018. [Google Scholar]
  105. Gruber, H.E.; Davis, S.N. Inching Our Way up Mount Olympus: The Evolving-Systems Approach to Creative Thinking; Cambridge University Press: Cambridge, UK, 1988. [Google Scholar]
  106. Sternberg, R.J.; Lubart, T.I. The concept of creativity: Prospects and paradigms. In The Cambridge Handbook of Creativity; Cambridge University Press: Cambridge, UK, 1999; Volume 1. [Google Scholar]
  107. Yeh, Y.-C. The Development of “Technological Creativity Test” and the Construction of Its Scoring Norm. Psychol. Test. 2004, 51, 127–162. [Google Scholar] [CrossRef]
  108. Yeh, Y.-C. Technological Creativity Test; Psychological Publishing: Taipei, Taiwan, 2005. [Google Scholar]
Table 1. Participant numbers and groups.

Group | Gender | Students | Absent | Participants | Group Total
Keyboard group | Male | 13 | −2 | 11 | 21
Keyboard group | Female | 13 | −3 | 10 |
PicoBoard group | Male | 12 | −2 | 10 | 21
PicoBoard group | Female | 14 | −3 | 11 |
Touch screen group | Male | 12 | −1 | 11 | 25
Touch screen group | Female | 16 | −2 | 14 |
Wii remote group | Male | 13 | −1 | 12 | 25
Wii remote group | Female | 14 | −1 | 13 |
All groups | | | | | 92
Table 2. Experimental procedure.

Group | Week 1–5: Basic Programming Learning | Pre-Test | Week 6: Introduce Interface | Week 7–9: Apply Interface to Scratch | Week 10–12: Scratch Game for Interface Control | Post-Test
Keyboard group | Keyboard and Mouse | Technological Creativity Test | Keyboard and Mouse | Keyboard and Mouse | Keyboard and Mouse | Technological Creativity Test
PicoBoard group | Keyboard and Mouse | Technological Creativity Test | PicoBoard | PicoBoard | PicoBoard | Technological Creativity Test
Touch screen group | Keyboard and Mouse | Technological Creativity Test | Touch screen | Touch screen | Touch screen | Technological Creativity Test
Wii remote group | Keyboard and Mouse | Technological Creativity Test | Wii remote | Wii remote | Wii remote | Technological Creativity Test
Table 3. Description of creativity categories in the Technological Creativity Assessment Guide.

No. | Category | Description
1 | Transportation | A vehicle that can transport something (or someone) from one place to another.
2 | Coursework | Functions related to study, homework, and coursework.
3 | Convenience | Saves effort and time in various tasks or improves everyday functions.
4 | Obedience | Able to obey orders to serve its owner.
5 | Computer-related products | Products related to computers and their peripherals.
6 | Communication | Ways, methods, or tools of communication.
7 | Home appliances | Various electrical appliances, such as washing machines, air conditioners, and juicers.
8 | Storage | Provides various spaces for placing items.
9 | Household products | Household supplies and items.
10 | Food supply | Supply of food, drinks, and catering.
11 | Appreciate and beautify | Beautifies, decorates, or embellishes something, or adds to it to make it more interesting.
12 | Entertainment | Behaviors or items that help people relax physically and mentally.
13 | Oral expression | Able to communicate verbally.
14 | Time | Has the function of telling time or timing.
15 | Transformation | Able to change its original shape for convenience or beauty.
16 | Magic | Imagination or wishes that do not exist in the real world.
17 | Health and medical | Human physical or mental condition, health care products, and treatments against diseases.
18 | Protection | Tools used to protect oneself.
19 | Weapons | Tools used to attack others.
20 | Biochemistry | Functions of biotechnology.
21 | Transactions | Functions or places for trading items.
22 | Tools | The use and application of various utensils, tools, instruments, equipment, and devices.
23 | Energy and environmental protection | Related to saving and producing energy.
Table 4. Descriptive statistics and paired t-test results of pre-test and post-test scores for each group.

Group | Aptitude | Pre-Test Mean (SD) | Post-Test Mean (SD) | t | p | Cohen's d
Keyboard | Fluency | 36.38 (27.61) | 54.19 (26.84) | −2.666 | 0.015 | 0.654
Keyboard | Flexibility | 28.24 (24.53) | 46.38 (28.95) | −2.489 | 0.022 | 0.676
Keyboard | Originality | 42.71 (28.60) | 51.05 (29.58) | −1.107 | 0.281 | 0.287
Keyboard | Elaboration | 39.57 (24.46) | 59.95 (25.51) | −3.971 | 0.001 | 0.816
Keyboard | Total Score | 35.19 (25.89) | 50.71 (28.82) | −2.528 | 0.020 | 0.567
PicoBoard | Fluency | 54.43 (33.28) | 63.14 (32.09) | −1.443 | 0.165 | 0.266
PicoBoard | Flexibility | 37.52 (27.98) | 39.38 (29.29) | −0.306 | 0.762 | 0.065
PicoBoard | Originality | 56.90 (36.23) | 69.95 (29.94) | −1.497 | 0.150 | 0.393
PicoBoard | Elaboration | 54.62 (27.47) | 58.00 (24.18) | −0.470 | 0.643 | 0.131
PicoBoard | Total Score | 51.76 (31.33) | 59.57 (28.37) | −1.317 | 0.203 | 0.261
Touch screen | Fluency | 37.28 (29.54) | 48.52 (33.13) | −1.705 | 0.101 | 0.358
Touch screen | Flexibility | 24.00 (27.93) | 36.56 (31.23) | −2.139 | 0.043 | 0.424
Touch screen | Originality | 40.20 (31.86) | 44.44 (35.19) | −0.500 | 0.621 | 0.126
Touch screen | Elaboration | 39.08 (23.66) | 57.24 (34.48) | −3.086 | 0.005 | 0.614
Touch screen | Total Score | 33.64 (28.08) | 48.36 (34.05) | −2.246 | 0.034 | 0.472
Wii remote | Fluency | 38.96 (25.38) | 32.00 (27.51) | 1.305 | 0.204 | 0.263
Wii remote | Flexibility | 17.36 (22.61) | 16.12 (16.91) | 0.314 | 0.757 | 0.062
Wii remote | Originality | 45.72 (32.88) | 42.68 (36.40) | 0.416 | 0.681 | 0.088
Wii remote | Elaboration | 44.16 (21.83) | 37.68 (24.15) | 1.171 | 0.253 | 0.282
Wii remote | Total Score | 33.72 (25.96) | 31.44 (25.43) | 0.436 | 0.667 | 0.089
All groups | Fluency | 41.45 (29.37) | 48.66 (31.71) | −2.265 | 0.026 | 0.236
All groups | Flexibility | 26.25 (26.42) | 33.89 (28.93) | −2.580 | 0.011 | 0.276
All groups | Originality | 46.09 (32.56) | 51.29 (34.34) | −1.301 | 0.196 | 0.155
All groups | Elaboration | 44.12 (24.66) | 52.72 (28.73) | −2.724 | 0.008 | 0.321
All groups | Total Score | 38.15 (28.36) | 46.86 (30.70) | −2.858 | 0.005 | 0.295
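As a point of reference for how figures of this kind can be reproduced, the sketch below shows one way to compute a paired t-test together with an effect size. It is a minimal illustration, not the authors' analysis script: it assumes Cohen's d is taken as the absolute pre/post mean difference divided by the pooled pre- and post-test standard deviation, which is consistent with the tabled values when checked against the group means and SDs (e.g., 0.654 for the keyboard group's fluency scores), and the score arrays are placeholders because the per-student data are not published here.

```python
# Minimal sketch (not the study's own code) of the paired t-test and effect size in Table 4.
# Assumption: Cohen's d = |mean(post) - mean(pre)| / pooled pre/post SD.
import numpy as np
from scipy import stats

def paired_test(pre, post):
    """Return t, p, and Cohen's d for one group and one creativity aptitude."""
    pre, post = np.asarray(pre, dtype=float), np.asarray(post, dtype=float)
    res = stats.ttest_rel(pre, post)                  # negative t when post > pre, matching Table 4
    pooled_sd = np.sqrt((pre.std(ddof=1) ** 2 + post.std(ddof=1) ** 2) / 2)
    d = abs(post.mean() - pre.mean()) / pooled_sd     # pooled-SD effect size
    return res.statistic, res.pvalue, d

# Placeholder scores for 21 students (the raw per-student data are not published here).
rng = np.random.default_rng(seed=1)
pre_scores = rng.normal(36.4, 27.6, size=21)
post_scores = rng.normal(54.2, 26.8, size=21)
print(paired_test(pre_scores, post_scores))
```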
Table 5. Descriptive statistics of post-test scores for each group, Mean (SD).

Aptitude | Keyboard (N = 21) | PicoBoard (N = 21) | Touch Screen (N = 25) | Wii Remote (N = 25) | All Groups (N = 92)
Fluency | 54.19 (26.84) | 63.14 (32.09) | 48.52 (33.13) | 32.00 (27.51) | 48.66 (31.71)
Flexibility | 46.38 (28.95) | 39.38 (29.29) | 36.56 (31.23) | 16.12 (16.91) | 33.89 (28.93)
Originality | 51.05 (29.58) | 69.95 (29.94) | 44.44 (35.19) | 42.68 (36.40) | 51.29 (34.34)
Elaboration | 59.95 (25.51) | 58.00 (24.18) | 57.24 (34.48) | 37.68 (24.15) | 52.72 (28.73)
Total Score | 50.71 (28.82) | 59.57 (28.37) | 48.36 (34.05) | 31.44 (25.43) | 46.86 (30.70)
Table 6. ANCOVA results of pre-test and post-test scores for each group.

Aptitude | Source | SS | df | MS | F | p | η²
Fluency | Pre-test | 19,612.743 | 1 | 19,612.743 | 28.494 | 0.000 |
Fluency | Groups | 8628.451 | 3 | 2876.150 | 4.179 | 0.008 | 0.126
Fluency | Error | 59,883.306 | 87 | 688.314 | | |
Flexibility (squared) | Pre-test | 106.813 | 1 | 106.813 | 19.641 | 0.000 |
Flexibility (squared) | Groups | 70.904 | 3 | 23.635 | 4.346 | 0.007 | 0.130
Flexibility (squared) | Error | 473.130 | 87 | 5.438 | | |
Originality | Pre-test | 9387.345 | 1 | 9387.345 | 9.326 | 0.003 |
Originality | Groups | 7121.090 | 3 | 2373.697 | 2.358 | 0.077 | 0.075
Originality | Error | 87,568.160 | 87 | 1006.531 | | |
Elaboration | Pre-test | 10,748.590 | 1 | 10,748.590 | 16.554 | 0.000 |
Elaboration | Groups | 8602.076 | 3 | 2867.359 | 4.416 | 0.006 | 0.132
Elaboration | Error | 56,490.363 | 87 | 649.315 | | |
Total Score | Pre-test | 18,701.326 | 1 | 18,701.326 | 28.372 | 0.000 |
Total Score | Groups | 5859.322 | 3 | 1953.107 | 2.963 | 0.037 | 0.093
Total Score | Error | 57,346.023 | 87 | 659.150 | | |
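An analysis with this layout (post-test score as the dependent variable, pre-test score as covariate, group as a fixed factor, and partial η² computed as SS_effect / (SS_effect + SS_error)) can be reproduced along the lines of the following sketch. It is a hedged illustration rather than the authors' pipeline: the data frame columns 'pre', 'post', and 'group' are placeholder names, and the model formula is an assumption consistent with the degrees of freedom in Table 6 (1 for the covariate, 3 for groups, 87 for error with N = 92) and with the reported effect sizes (e.g., 8628.451 / (8628.451 + 59,883.306) ≈ 0.126 for fluency).

```python
# Hedged sketch of an ANCOVA of post-test scores with the pre-test as covariate and
# group as a fixed factor. Column names 'pre', 'post', 'group' are illustrative only.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

def ancova(df: pd.DataFrame) -> pd.DataFrame:
    model = smf.ols("post ~ pre + C(group)", data=df).fit()
    aov = sm.stats.anova_lm(model, typ=2)                       # Type II sums of squares
    ss_error = aov.loc["Residual", "sum_sq"]
    # Partial eta squared per effect (the value computed for the 'Residual' row itself is not meaningful).
    aov["partial_eta_sq"] = aov["sum_sq"] / (aov["sum_sq"] + ss_error)
    return aov

# Usage with a hypothetical long-format table of the 92 students:
# df = pd.DataFrame({"group": [...], "pre": [...], "post": [...]})
# print(ancova(df))
```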
Table 7. Post-hoc analysis.

Aptitude | Bonferroni test
Fluency | Keyboard > Wii (p = 0.019); PicoBoard > Wii (p = 0.026)
Flexibility | Keyboard > Wii (p = 0.004); Touch screen > Wii (p = 0.083)
Originality | PicoBoard > Wii (p = 0.084)
Elaboration | Keyboard > Wii (p = 0.011); Touch screen > Wii (p = 0.020)
Total score | —
Table 8. Device properties.

Property | Keyboard | PicoBoard | Touch Screen | Wii Remote
Freedom of control | High | Medium | Low | Low
Device familiarity | High | Low | High | Medium
Experience gap | Low | — | Low | High
Physical sensing | Low | Medium | Low | Low
Table 9. Predicting the outcomes of other technological devices on creative abilities.

No. | Control Device | Fluency | Flexibility | Originality | Elaboration
1 | General game controller | | | |
2 | Wii (with IR and scroll mode) | | | |
3 | Kinect | | | |
4 | Smartphone | | | |
5 | Arduino UNO | | | |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
