Article

Components and Indicators of the Robot Programming Skill Assessment Based on Higher Order Thinking

by Chacharin Lertyosbordin 1,*, Sorakrich Maneewan 1,* and Matt Easter 2
1 Faculty of Industrial Education and Technology, King Mongkut’s University of Technology Thonburi, Bangkok 10140, Thailand
2 College of Education and Human Development, University of Missouri, Columbia, MO 65201, USA
* Authors to whom correspondence should be addressed.
Appl. Syst. Innov. 2022, 5(3), 47; https://doi.org/10.3390/asi5030047
Submission received: 7 March 2022 / Revised: 9 April 2022 / Accepted: 28 April 2022 / Published: 30 April 2022
(This article belongs to the Special Issue Applied Systems on Emerging Technologies and Educational Innovations)

Abstract:
Robot programming classes are becoming more popular, and higher order thinking is an important issue in developing the skills of 21st-century learners; the two are closely related and both are trending topics in academia. The purpose of this study is to design the components and indicators of a robot programming skill assessment based on higher order thinking. The methodology is divided into two phases: (1) qualitative research: a literature review to synthesize the components and indicators of the robot programming skill assessment based on higher order thinking; and (2) quantitative research: testing the validity of the assessment with the content validity index (CVI) judged by seven experts, and testing its reliability with Cronbach’s alpha computed from the questionnaire results of 50 participants. The results show that the synthesized robot programming skill assessment consists of three components with 16 indicators, all of which were accepted by the content validity assessment (CVI = 1.00), and the internal consistency calculation for the reliability test showed acceptable reliability (α = 0.747).

1. Introduction

Robot programming courses continue to emerge [1]. The OECD Future of Education and Skills 2030 report clearly identifies robotics engineering as an important professional field [2]. At the same time, higher order thinking (HOT) is an important issue in developing 21st-century learners’ skills [3,4,5,6].
Robot systems typically process sensor data, perform recognition, and plan their operations using computer programs running on a processor [7,8,9]. Robot programming, therefore, is the entry of a set of instructions that direct the robot to work by taking values from the inputs to generate the outputs [10,11,12,13,14,15]. Moreover, programming follows a universally recognized procedure, which consists of: (1) identifying the problem, (2) goal setting, (3) creating the solution, (4) acting on the solution and (5) returning to check the results [16,17,18,19,20,21]. The behaviors in these five processes all arise from higher order thinking in the brain [22,23,24,25].
The concept of higher order thinking skills became an important educational topic in 1956, when Bloom et al. [26] published the taxonomy of educational objectives and identified the higher order levels as analysis, synthesis, and evaluation. Later, in 2001, Bloom’s taxonomy of educational objectives was revised by Anderson et al. [27]. They modified the six levels of Bloom’s cognitive domain [26] but continued to define higher order thinking as starting at the analytic thinking stage. The details are presented in Table 1.
The descriptions of “robot programming” and “higher order thinking” above are drawn from a range of references, and the two skills are consistent with each other. Therefore, in this research, we design the components and indicators of robot programming skills based on higher order thinking and test those indicators empirically, providing evidence that instructors can apply or build on when measuring and evaluating learners’ skills arising from learning activities.

2. Literature Review

In conducting this research, the researchers reviewed the literature to synthesize the components and indicators of the robot programming skill assessment based on higher order thinking. The review is divided into two parts: (1) robot programming and (2) higher order thinking. The details are as follows:

2.1. Robot Programming

The OECD [2] has released the Future of Education and Skills 2030 report, which identifies the robot engineer’s job as one requiring important skills due to the demands of future technology. An analysis of the industrial robotics market also shows that robot software will be used in robot operations to achieve specific objectives through computer program coding [28]; this market is expected to grow exponentially between 2019 and 2025 due to the adoption of the Internet of Things (IoT), Artificial Intelligence (AI), and other software technologies [29]. Many educational institutions are now adopting robot programming as part of their efforts to enhance students’ higher order thinking skills, beginning with the MIT Media Lab under the supervision of Professor Seymour Papert in 1985 [30]. To date, robot programming processes have often been used in STEM (science, technology, engineering and mathematics) learning management [31,32,33,34].
In this article, the researchers examine the two words “Robot” and “Programming” to clarify the meaning of robot programming skills, drawing on scholarly dictionaries. The Oxford Dictionary of Computing defines the term “robot” as “programmable devices consisting of mechanical actuators and sensory organs that are linked to a computer” [35], whereas the Oxford Dictionary of Computer Science defines the term “programming” as “all technical activities involved in the production of a program, including analysis of requirements and all stages of design and implementation. In a much narrower sense, it is the coding and testing of a program from some given design” [36].
From the meanings of the two terms presented above, it can be concluded that robot programming skills refer to “All technical activities related to the production of programs through the coding and testing of the program from the given design to the control of programmable devices consisting of actuators and sensors linked to the computer” [35,36]. This definition is consistent with the programming process synthesized by Lertyosbordin [15] and other academic sources [35,36,37,38,39,40,41,42,43]. The programming process consists of the following steps (a minimal worked code sketch follows the list):
(1)
Identify the Problem: This refers to understanding the problem and determining the “Input”, “Process” and “Output” components that must be completed in order to solve the problem.
(2)
Design a Solution: This refers to the process of ordering the sequence of algorithms using flowcharts or pseudocodes.
(3)
Coding the Program: This is the way of transforming the commands and procedure sequence from the conceptual design into a programming language.
(4)
Test the Program: This refers to the validation of the syntax of the computer code and the interpretation of the results for the goals of program execution. It also includes testing for hardware compatibility, covering the input and output sections.
(5)
Program Implementation: This refers to putting the outcomes of the program to use, which should then be followed by further enhancements.
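As a concrete illustration of these five steps, the minimal Python sketch below walks a toy line-following decision through the same sequence; the sensor values and threshold are hypothetical and are not part of the synthesized process.

# Step 1 - Identify the problem: input = light sensor reading, process = compare
# against a threshold, output = motor command ("turn" or "go straight").
# Step 2 - Design a solution: if the reading is below the threshold, the robot is
# off the line and should turn; otherwise it keeps going straight.

THRESHOLD = 512  # hypothetical calibration value

def decide(sensor_value: int) -> str:
    """Step 3 - Coding: translate the designed rule into a programming language."""
    if sensor_value < THRESHOLD:
        return "turn"
    return "go straight"

# Step 4 - Test the program: check the code against known inputs and expected outputs.
assert decide(100) == "turn"
assert decide(900) == "go straight"

# Step 5 - Program implementation: use the validated rule (here, just print it);
# further enhancements, e.g., proportional control, would follow in later iterations.
print(decide(300))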
Having defined robot programming and synthesized the standard programming process [37,38,39,40,41,42,43], we can determine the components and indicators of the robot programming skill by applying the coding skills indicator of Surfing Scratcher [44], which was developed from Griffin’s work on creating educational measurements [45], combined with the cognitive-skill indicating verbs of Schraw and Robinson [46]. In addition, the researchers analyzed the verbs used in a variety of empirical studies [47,48,49,50,51,52,53] that evaluate cognitive skills in robot control programming tasks. The components and indicators of the robot programming skill can then be synthesized as follows (an illustrative code sketch for Components 2 and 3 is given after the list):
(1)
Component 1: The ability to solve problems step by step:
  • Describe the problem and the sequence of ways to solve it.
  • Draw the flowcharts or pseudocodes to show the sequence of ways to solve the problems.
  • Change the sequence of steps if the results are not met.
  • Tackle the presented tasks by breaking them down into smaller tasks.
  • Capture the issues that can cause problems to repeat.
(2)
Component 2: The ability to create computer programs:
  • Create a program by a computer language from a blank page.
  • Create a program with a single-decision condition.
  • Create a program with the nested structure of decision conditions.
  • Create a variable to control the loop task programs.
  • Create a variable and input data that affect the output.
  • Build your own program from the beginning, until you achieve the objectives.
  • Create a function that can modify parameters.
(3)
Component 3: The ability to connect to the robot:
  • Connect the port between the computer and the microcontroller.
  • Create objects for using analog and/or digital signals.
  • Create a graphical user interface (GUI) to display the analog and/or digital inputs.
  • Create a graphical user interface (GUI) for the digital outputs.
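To make Components 2 and 3 more tangible, here is a minimal, hedged Python sketch using the pyserial and tkinter libraries: it connects to a microcontroller over a serial port, reads a hypothetical analog value that the firmware is assumed to print line by line, and displays it in a simple GUI. The port name, baud rate, and message format are assumptions for illustration, not requirements stated by the indicators.

import tkinter as tk

import serial  # pyserial; assumed to be installed

PORT = "/dev/ttyUSB0"   # hypothetical port; on Windows this might be "COM3"
BAUD = 9600             # hypothetical baud rate matching the robot firmware

def read_analog(link: serial.Serial) -> str:
    """Indicator 14: read one line the firmware is assumed to print, e.g. '512'."""
    return link.readline().decode(errors="ignore").strip() or "no data"

def main() -> None:
    # Indicator 13: connect the port between the computer and the microcontroller.
    link = serial.Serial(PORT, BAUD, timeout=1)

    # Indicators 15-16: a minimal GUI that displays the analog input value.
    root = tk.Tk()
    root.title("Robot monitor (illustrative)")
    label = tk.Label(root, text="waiting...", font=("Arial", 24))
    label.pack(padx=20, pady=20)

    def refresh() -> None:
        label.config(text=read_analog(link))
        root.after(500, refresh)  # poll twice per second

    refresh()
    root.mainloop()

if __name__ == "__main__":
    main()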

2.2. Higher Order Thinking

The concept of higher order thinking skills became an important educational topic when Bloom et al. [26] published the Taxonomy of Educational Objectives and described higher order thinking skills as Analysis, Synthesis and Evaluation. Later, in 1987, Resnick [54] researched the teachings of science and mathematics with an educational theme focusing on higher order thinking among public school students across the United States. The studies have shown that higher order thinking skills are important skills in the scientific thinking process and can be developed from the elementary school level and onward. In addition, Resnick said “Higher order thinking involves a cluster of elaborative mental activities requiring nuanced judgment and analysis of complex situations according to multiple criteria. Higher order thinking is effortful and depends on self-regulation” [54].
This is consistent with Lewis and Smith [55], who concluded that higher order thinking skills are the processes used to respond to situations through critical thinking and problem solving. Moreover, King et al. [56] stated that “Higher order thinking skills include critical, logical, reflective, metacognitive, and creative thinking. They are activated when individuals encounter unfamiliar problems, uncertainties, questions, or dilemmas” [56]. Later, in 2001, Anderson et al. [27] revised Bloom’s Taxonomy of Educational Objectives, pointing out that learners’ thinking characteristics should be divided into two dimensions consisting of the “Knowledge Dimension” and the “Cognitive Process Dimension”. They also modified Bloom et al.’s six cognitive levels [26], but still defined higher order thinking as starting at the analytic thinking stage, as detailed in Table 1.
From the details of higher order thinking skills mentioned above, it can be concluded that higher order thinking skills refer to “the intellectual ability from the application of knowledge to the creation of new ideas of one’s own” [26,27,54,55,56]. In this research, we used the higher order thinking skills theory based on Anderson et al.’s revision of Bloom’s cognitive taxonomy [27] as the basis for designing the components and indicators of this study. The revision of Bloom’s cognitive taxonomy describes the “Knowledge Dimensions” and “Cognitive Process Dimensions” as follows:
(1)
The knowledge dimensions:
  • Factual—The fundamental understanding of terminology, scientific terms, labels, lexicon, slang, symbols or representations, and specifics, such as knowledge of events, individuals, and information sources.
  • Conceptual—Knowledge of a subject’s classifications and categories, concepts, theories, models, or frameworks.
  • Procedural—Knowing how to perform a skill, procedure, technique, or methodology.
  • Metacognitive—The method or approach of learning and thinking, being aware of one’s own cognition and being able to control, monitor and regulate one’s own cognitive process.
(2)
The cognitive process dimensions:
  • Analyze—Break down material into its parts and determine how the parts relate to one another and to an overall concept or purpose by differentiating, organizing, and attributing.
  • Evaluate—Make decisions utilizing criteria and standards by checking and critiquing.
  • Create—Integrate elements to create a coherent or functional whole; reorganize elements to create a new structure or pattern by generating, planning, and producing.
Assessment of higher order thinking flourished in the 19th century as a way to verify the validity of teaching methods against specific objectives and to determine the standard level of learners at each grade [57]. To date, assessment of knowledge and cognitive processes has been used to build student enthusiasm and to develop learners’ skills in accordance with the learning objectives [58]. Corliss and Linn [59] suggested a method for measuring thinking skills in scientific learning activities, presented in Table 2.
From Table 2, we found that higher order thinking skills can arise in the scientific thinking process, where teachers can measure and assess students’ skills through learning activities. Therefore, in this study, the researchers used the higher order thinking skills dimensions of Anderson et al. [27], which consist of the knowledge dimension and the cognitive process dimension, presented in Table 3.
From Table 3, we can identify higher order thinking by these 12 behavior indicators. In computational science, these behaviors also correspond to a group of competencies known as computational thinking. Selby [60] shows that higher order thinking skills are directly linked to computational thinking, which consists of decomposition, abstraction, algorithm design, generalization, and evaluation. The relation is shown in Figure 1.
From Table 3 and Figure 1, we thus have higher order thinking indicators for programming pedagogy. However, to assess the level of skill for each attribute, a numerical rating scale is needed to measure performance. Leighton [61] shows that measurements can be made on a 0–100 line divided into five levels (0–4), as shown in Figure 2.
The rating scale was divided into five ranges, as shown in Figure 2. This corresponds to Likert [62], who supports rating scales with an odd number of points (3, 5, 7, …). Figure 2 also suggests that teachers should not assess learners by dividing them into only two sides (white and black), because some learners’ abilities fall into a gray area; defining the middle of the scale is therefore a suitable way to assess learners’ abilities more clearly. This led to the specific research method of testing the robot programming skill indicators with a five-range rating scale (0–4), as described in the research methodology.
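As a small illustration of this division, the following Python sketch maps a raw 0–100 measure onto the five levels (0–4); the assumption that the levels cover equal 20-point bands is one plausible reading of Figure 2 rather than a detail stated in the text.

def to_level(score: float) -> int:
    """Map a raw 0-100 measure to a 0-4 rating using equal 20-point bands (assumption)."""
    if not 0 <= score <= 100:
        raise ValueError("score must lie on the 0-100 line")
    return min(int(score // 20), 4)  # 100 falls into the top level rather than a sixth band

# Example: scores near the middle of the line are not forced to the extremes,
# which is the 'gray area' argument for using an odd number of levels.
print([to_level(s) for s in (5, 35, 50, 79, 100)])  # -> [0, 1, 2, 3, 4]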

3. Objectives

(1)
To synthesize the components and indicators of the robot programming skill assessment based on higher order thinking.
(2)
To evaluate the validity and reliability of the robot programming skill assessment based on higher order thinking.

4. Methodology

4.1. The Details of the Participants in This Research

(1)
To test the validity of the components and indicators, presented as a questionnaire instrument on the robot programming skill based on higher order thinking, the researchers worked with seven experts from various fields, whose qualifications were as follows:
  • The Ph.D. lecturers in Educational Evaluation; two persons.
  • The Ph.D. lecturers in Computer Engineering; two persons.
  • The Ph.D. lecturer in Educational Technology; one person.
  • The Ph.D. lecturer in Psychology; one person.
  • The psychiatrist with at least 5 years of adolescent behavior research experience; one person.
(2)
To test the reliability of the components and indicators, the researchers used the robot programming assessment instrument with 50 volunteers in a robot programming skills training program (the July 2021 course of MARA: the Manufacturing Automation and Robotics Academy, Ministry of Industry, Thailand). Participants were recruited through a public announcement made by the Department of Skill Development via the MARA website [63] in June 2021. Within three weeks of the announcement, 200 people had signed up for the training course. The researchers then set a quota of 50 technician volunteers to use the assessment instrument. All participants were industrial plant technicians with no prior experience of programming a robot before enrolling in the training course.

4.2. The Details of the Research Instrument

After identifying the key components and indicators of the robot programming skill assessment based on higher order thinking, we created the instrument to measure robot programming skills. The instrument uses three components with 16 indicators and is designed for trainees to rate themselves on a five-level scale (0–4) as follows:
0 points means the task cannot be done.
1 point means the task is not completed, even with the use of manuals or other aids.
2 points means the task is accomplished by always using manuals or other aids.
3 points means the task is accomplished by sometimes using manuals or other aids.
4 points means the task is completed independently, without a manual or other aids.
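The paper does not prescribe how the 16 self-ratings are combined; purely as a hypothetical illustration, the Python sketch below averages one trainee’s ratings within each of the three components, assuming indicators 1–5, 6–12, and 13–16 map to Components 1–3 as listed in Section 2.1.

# Hypothetical aggregation of one trainee's 16 self-ratings (each 0-4).
ratings = [3, 2, 4, 3, 2,          # Component 1: indicators 1-5
           4, 3, 3, 2, 3, 4, 2,    # Component 2: indicators 6-12
           4, 3, 2, 3]             # Component 3: indicators 13-16

components = {
    "solve problems step by step": ratings[0:5],
    "create computer programs":    ratings[5:12],
    "connect to the robot":        ratings[12:16],
}

for name, values in components.items():
    print(f"{name}: mean rating = {sum(values) / len(values):.2f}")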

4.3. The Details of the Synthesis of the Components and Indicators

A systematic review and analysis was conducted based on the following research question: “How does robot programming affect higher order thinking?” The inclusion/exclusion steps, data source criteria, and search strategies are described below.
(1)
Inclusion criteria:
  • Published between 2013 and 2022.
  • Include articles with search terms in the title and abstract.
  • Include experimental research publications in the search.
  • Include papers for which the abstract content corresponds to the research question.
(2)
Exclusion criteria:
  • No full-text article is available.
  • Unrelated to research due to inconsistency with the research question.
  • Duplicate study (if there are multiple databases).
  • Insufficient information.
(3)
Data sources and search strategies:
The studies included in this scoping review (systematic review) were located via a comprehensive search of the publicly available literature through manual electronic searches of SCOPUS, IEEE, and Thai-JO. The search strategies varied according to the tool used. The search terms combined the following keywords: “higher order thinking” or “problem solving” or “critical thinking” or “computational thinking” with “robot” and “programming” or “coding”. Figure 3 shows the literature search process in which the studies were identified, screened, and evaluated for inclusion. Based on the criteria, seven papers were chosen for the final analysis.
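Expressed as a single boolean query, one possible concrete form of the strategy above would be the following (exact field syntax differs between SCOPUS, IEEE Xplore, and Thai-JO, so this string is illustrative rather than the query actually submitted):

("higher order thinking" OR "problem solving" OR "critical thinking" OR "computational thinking") AND "robot" AND ("programming" OR "coding")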

4.4. The Details of the Evaluations of the Components and Indicators

The findings of the systematic review are presented below, and here we detail the steps that were taken to collect the data:
(1)
An initial assessment of the content validity for all components and indicators was conducted by having seven experts perform the evaluation using a content validity index test (CVI) [64]. This process led to minor revisions of some key language, but the original content remained the same.
(2)
After revising the instrument, the questionnaire, provided as a Google Forms link, was given to the trainer who supervised the robot programming skill training (the July 2021 course of the Manufacturing Automation and Robotics Academy, Ministry of Industry, Thailand). The trainer then passed the link to the trainees, who formed the sample group, so that they could rate themselves after finishing the course.
(3)
The collected data were analyzed using Cronbach’s alpha statistic [65] to examine the reliability of the components and indicators of the robot programming skill assessment based on higher order thinking (a brief computational sketch of both statistics is given after this list).
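For readers who want to reproduce the two statistics named in items (1) and (3), the Python sketch below computes an item-level content validity index (the proportion of experts rating an item 3 or 4 on a 4-point relevance scale, following the usual CVI convention) and Cronbach’s alpha from a respondents-by-items score matrix; the numbers in the example are invented for illustration and are not the study’s data.

from statistics import pvariance

def item_cvi(expert_ratings: list[int]) -> float:
    """I-CVI: share of experts rating the item relevant (3 or 4 on a 1-4 scale)."""
    return sum(r >= 3 for r in expert_ratings) / len(expert_ratings)

def cronbach_alpha(scores: list[list[int]]) -> float:
    """Cronbach's alpha for a respondents-by-items score matrix."""
    k = len(scores[0])                                    # number of items (indicators)
    item_vars = [pvariance(col) for col in zip(*scores)]  # variance of each item
    total_var = pvariance([sum(row) for row in scores])   # variance of total scores
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Invented data: 7 experts rating one indicator, and 5 respondents on 4 items.
print(item_cvi([4, 4, 3, 4, 3, 4, 4]))            # -> 1.0
print(cronbach_alpha([[3, 2, 3, 4],
                      [4, 3, 4, 4],
                      [2, 2, 3, 3],
                      [3, 3, 2, 4],
                      [4, 4, 4, 4]]))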

5. Results

(1)
The researchers discovered seven empirical studies that could be used for synthesizing the components and indicators of the robot programming skill assessment based on higher order thinking, after conducting a literature study using the scoping review analysis. The results are shown in Table 4 and Table 5.
(2)
The researchers developed the components and indicators of robot programming skills based on higher order thinking by combining programming procedures with the verbs indicating cognitive skills that were analyzed from the scoping review. The results are shown in Table 6.
(3)
The validity test conducted by seven experts showed that all 16 items measuring the three components reached acceptable validity based on the content validity index test (CVI = 1.00).
(4)
The reliability analysis used Cronbach’s alpha statistic to examine the internal consistency of the components and indicators. The Cronbach’s alpha for all 16 indicators was 0.747. The reliability analysis of the three components of the questionnaire gave: (1) the ability to solve problems step by step (α = 0.827), (2) the ability to create computer programs (α = 0.722), and (3) the ability to connect to the robot (α = 0.778). Since the Cronbach’s alpha values of all components are greater than 0.7, we can conclude that both the individual components and the overall set of indicators have acceptable reliability.

6. Discussion

The components and indicators of the robot programming skill assessment based on higher order thinking should be used to measure students’ skills by assessing how well students can express what they have learned, also called “authentic assessment” [66], in which it is difficult for the instructor to make an exact assessment and judgment of the workpiece; it is therefore essential to establish scoring and quality criteria [67]. As a result, after the components and indicators were designed and created, we tested their validity and found that the index of item-objective congruence (IOC) was 1.00. This can be interpreted as acceptable validity, implying that all indicators address the measurement concerns accurately and suitably and can be used in practice. The examination of the validity of the components and indicators in this study is compatible with the results of Rovinelli and Hambleton [68], who argued that the formulation of any measure should be validated before it is employed. This is also consistent with the research of Müller et al. [69], who explained that assessing measuring instruments with several assessors can reveal the instrument’s accuracy.
In this study, the created components and indicators were evaluated with 50 volunteers and the results were examined with Cronbach’s alpha statistic, the process that measures the reliability, or internal consistency, of the components and indicators [70]. Cronbach’s alpha values above 0.7 are typically accepted when interpreting Likert-scale questions [71]. After the data analysis, the Cronbach’s alpha for the 16 indicators was 0.747, which means that all 16 indicators have acceptable internal consistency. In addition, the analysis of the three components, (1) the ability to solve problems step by step, (2) the ability to create computer programs, and (3) the ability to connect to the robot, gave values of 0.827, 0.722, and 0.778, respectively, all demonstrating acceptable internal consistency. The purpose of this part of the research was to test the precision of the measurements, an important process when measuring learners’ skills through authentic assessment [72]. This corresponds with the results obtained by Segal et al. [73], who employed a variety of evaluators to develop learning behavior measurement tools that could provide accurate and reliable data when assessing learners’ reactions.
From the validity and reliability analyses described above, the created components and indicators were accepted by both tests. This may be due to the following reasons:
(1)
The components were established from the literature review on “robot programming” [37,38,39,40,41,42,43] and are consistent with training courses that use the same universal programming process.
(2)
The robot programming skill is a higher order thinking skill based on Bloom’s cognitive taxonomy that falls into three categories: problem solving, critical thinking and the transfer of knowledge and skills [74]. The researchers can provide additional details as follows:
  • Component 1 (The ability to solve problems step-by-step) is the main ability of the robot programming skill. It conforms to the meaning of the following phrase: “Problem-solving approach”, defined by the APA Dictionary of Psychology [75] as “The process whereby difficulties, obstacles, or stressful events are addressed using coping strategies.”
  • Component 2 (The ability to create computer programs) is a part of the problem-solving skill that conforms to Jonassen [76], who details that programming activities could be classified as one solution for the “Design Problem Solving” type that focuses on analysis and planning. This also corresponds to Chandrasekaran [77], who determined that the key to problem solving is a step of critical thinking that understands the problem and defines the structure and sequence of work to fix the problem.
  • Component 3 (The ability to connect to the robot) involves applying knowledge of the robot modules that direct the robot to work by taking values from the inputs to generate the outputs. It conforms to Matsun et al. [78], who used Arduino Uno microcontroller programming as a tool for scientific learning and confirmed that this tool can promote students’ higher order thinking through learning activities such as hypothesizing about the relationship between input and output modules, testing the solution, observing the results, and improving the processes based on the results displayed by the system. This also corresponds to Avello-Martínez et al. [79], who mentioned that allowing students to experience the use of robotics in the classroom is another way to enhance the creative and computational thinking of the students, which is based on the cognitive processes of higher order thinking skills.
(3)
The keyword used to describe the robot programming skill in all 16 indicators corresponds to the skill that represents the higher order thinking of Anderson et al. [27], together with seven empirical studies [47,48,49,50,51,52,53] in the scoping review.
(4)
The scale for evaluating the robot programming skills was defined as a five-level scale (0–4), consistent with Marzano and Kendall [80], who established standardized measurement methods for assessments of the cognitive domain.

7. Conclusions

The robot programming skill assessments based on higher order thinking consisting of three components with 16 indicators are shown in Table 6. All components and indicators were accepted by the validity assessment. In addition, the reliability analysis indicated that both the individual components and the overall indicators demonstrated acceptable reliability. Thus, it was concluded that the components and indicators synthesized in this study could be used as a guide for measuring the robot programming skills based on higher order thinking.

8. Suggestion

It is commonly known in the field of education that the elements of learning or lesson design consist of three main parts that include: (1) learning objectives, (2) learning and teaching strategies and (3) learning outcome evaluations [81]. Consequently, this research is one of the approaches that learning designers can use for learning activities that focus on developing higher order thinking skills by using robot programming as a tool. They can apply the results of this research to teaching and learning management for obtaining valid and accurate learning outcomes. It also includes teaching strategy designs that are consistent with the learning objectives.
However, the components and indicators developed in this study are defined within the conceptual framework of the synthesis of generic programming processes and are based on the higher order thinking skills dimensions of Anderson et al. [27]. As a result, those implementing the findings of this study should first consider their compatibility with the grounding theory of their own learning design. Furthermore, the experimental test of the created indicators was only a first trial with 50 volunteers in the Department of Skill Development of Thailand’s robot programming skills training program. Therefore, the researchers advise those who use the findings of this study to consider the characteristics of the experimental group.
Finally, the researchers hope that applying the components and indicators developed in this research, and extending the results through experiments with samples of different sizes and demographic characteristics, will increase the credibility of the robot programming assessment based on these indicators and lead to future benchmarks for the evaluation of robot programming skills.

Author Contributions

C.L., S.M. and M.E. contributed to the design and implementation of the research, and the analysis of the results. C.L. developed the theory and performed the computations. S.M. and M.E. verified the analytical methods and supervised the findings of this work. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Ethics approval was not required in this research.

Informed Consent Statement

All participants consented to participate in this study. They were given the opportunity to ask questions regarding the study and the collection of the assessment scores. They received answers to any questions regarding the study.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Acknowledgments

This research article was supported by the “Petch Pra Jom Klao Research Scholarship” from King Mongkut’s University of Technology Thonburi.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Alimisis, D. Educational robotics: Open questions and new challenges. Themes Sci. Technol. Educ. 2013, 6, 63–71. Available online: http://earthlab.uoi.gr/theste/index.php/theste/article/view/119 (accessed on 3 February 2022).
  2. OECD. SKILLS FOR 2030. OECD Future of Education and Skills 2030 Concept Note. 2019. Available online: www.oecd.org/education/2030-project (accessed on 29 December 2021).
  3. Almerich, G.; Suárez-Rodríguez, J.; Díaz-García, I.; Cebrián-Cifuentes, S. 21st-century competences: The relation of ICT competences with higher-order thinking capacities and teamwork competences in university students. J. Comput. Assist. Learn. 2020, 36, 468–479. [Google Scholar] [CrossRef]
  4. Conklin, W. Higher-Order Thinking Skills to Develop 21st Century Learners; Shell Education: Huntington Beach, CA, USA, 2012. [Google Scholar]
  5. Widiawati, L.; Joyoatmojo, S.; Sudiyanto, S. Higher order thinking skills as effect of problem-based learning in the 21st century learning. Int. J. Multicult. Multireligious Underst. 2018, 5, 96–105. [Google Scholar]
  6. Hafni, R.N.; Nurlaelah, E. 21st Century Learner: Be a Critical Thinking. In Proceedings of the Second of International Conference on Education and Regional Development 2017 (ICERD 2nd), Bandung, Indonesia, 20–21 November 2017; Volume 1. [Google Scholar]
  7. Taylor, A.T.; Berrueta, T.A.; Murphey, T.D. Active learning in robotics: A review of control principles. Mechatronics 2021, 77, 102576. [Google Scholar] [CrossRef]
  8. Jean, A. A brief history of artificial intelligence. Medecine/Sciences 2020, 36, 1059–1067. [Google Scholar] [CrossRef]
  9. Mouha, R.A. Deep Learning for Robotics. J. Data Anal. Inf. Process. 2021, 9, 63–76. [Google Scholar] [CrossRef]
  10. Saukkoriipi, J.; Heikkilä, T.; Ahola, J.M.; Seppälä, T.; Isto, P. Programming and control for skill-based robots. Open Eng. 2020, 10, 368–376. [Google Scholar] [CrossRef]
  11. Herrero, H.; Moughlbay, A.A.; Outón, J.L.; Sallé, D.; de Ipiña, K.L. Skill based robot programming: Assembly, vision and Workspace Monitoring skill interaction. Neurocomputing 2017, 255, 61–70. [Google Scholar] [CrossRef]
  12. Cheah, C.-S. Factors Contributing to the Difficulties in Teaching and Learning of Computer Programming: A Literature Review. Contemp. Educ. Technol. 2020, 12, ep272. [Google Scholar] [CrossRef]
  13. Durak, H.Y.; Yilmaz, F.G.K.; Yilmaz, R. Computational Thinking, Programming Self-Efficacy, Problem Solving and Experiences in the Programming Process Conducted with Robotic Activities. Contemp. Educ. Technol. 2019, 10, 173–197. [Google Scholar] [CrossRef]
  14. Abadi, M.; Plotkin, G.D. A simple differentiable programming language. Proc. ACM Program. Lang. 2020, 4, 1–28. [Google Scholar] [CrossRef] [Green Version]
  15. Lertyosbordin, C.; Maneewan, S.; Srikaew, D. Components and Indicators of Problem-solving Skills in Robot Programming Activities. Int. J. Adv. Comput. Sci. Appl. 2021, 12, 132–140. [Google Scholar] [CrossRef]
  16. Department of Computer Science and Statistics. Computer Programming. The University of Rhode Island. 2020. Available online: https://homepage.cs.uri.edu/faculty/wolfe/book/Readings/Reading13.htm (accessed on 3 February 2022).
  17. Amjo. Six Steps in the Programming Process. Dotnet Languages. 30 June 2018. Available online: https://www.dotnetlanguages.net/six-steps-in-the-programming-process/ (accessed on 3 February 2022).
  18. Jorge Valenzuela. Computer Programming in 4 Steps. ISTE. International Society for Technology in Education (ISTE). 20 March 2018. Available online: https://www.iste.org/explore/Computer-Science/Computer-programming-in-4-steps (accessed on 3 February 2022).
  19. School of Computer Science. The Programming Process. University of Birmingham. Available online: https://www.cs.bham.ac.uk/~rxb/java/intro/2programming.html (accessed on 3 February 2022).
  20. Wikibooks. The Computer Revolution/Programming/Five Steps of Programming—Wikibooks, Open Books for an Open World. Wikibooks. 2021. Available online: https://en.m.wikibooks.org/wiki/The_Computer_Revolution/Programming/Five_Steps_of_Programming (accessed on 3 February 2022).
  21. Sharma, P.; Singh, D. Comparative Study of Various SDLC Models on Different Parameters. Int. J. Eng. Res. 2015, 4, 188–191. [Google Scholar] [CrossRef] [Green Version]
  22. Commons, M.L.; Crone-Todd, D.; Chen, S.J. Using SAFMEDS and direct instruction to teach the model of hierarchical complexity. Behav. Anal. Today 2014, 14, 31–45. [Google Scholar] [CrossRef]
  23. Lysaker, P.H.; Buck, K.D.; Carcione, A.; Procacci, M.; Salvatore, G.; Nicolò, G.; Dimaggio, G. Addressing metacognitive capacity for self reflection in the psychotherapy for schizophrenia: A conceptual model of the key tasks and processes. Psychol. Psychother. Theory Res. Pract. 2010, 84, 58–69. [Google Scholar] [CrossRef] [Green Version]
  24. Mahoney, M.J.; Kazdin, A.E.; Lesswing, M.J. Behavior modification: Delusion or deliverance? In Annual Review of Behavior Therapy: Theory & Practice; Franks, C.M., Wilson, G.T., Eds.; Brunner/Mazel: New York, NY, USA, 1974. [Google Scholar]
  25. Ardini, S.N. Teachers’ Perception, Knowledge and Behaviour of Higher Order Thinking Skills (HOTS). Eternal Engl. Teach. J. 2018, 8, 20–33. [Google Scholar] [CrossRef]
  26. Bloom, B.S.; Engelhart Max, D.; Furst Edward, J.; Hill Walker, H.; Krathwohl, D.R. Taxonomy of Educational Objectives: The Classification of Educational Goals; Edwards Bros.: Ann Arbor, MI, USA, 1956. [Google Scholar]
  27. Anderson, L.W.; Krathwohl, D.R. A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives; Longman: New York, NY, USA, 2001. [Google Scholar]
  28. Miho, T.; Katja, A. Future Shocks and Shifts: Challenges for the Global Workforce and Skills Development. April 2017. Available online: https://www.oecd.org/education/2030-project/about/documents/Future-Shocks-and-Shifts-Challenges-for-the-Global-Workforce-and-Skills-Development.pdf (accessed on 29 December 2021).
  29. Zion Market Research. Robot Software Market—Global Industry Analysis. Zion Market Research. 21 November 2019. Available online: https://www.zionmarketresearch.com/report/robot-software-industry (accessed on 29 December 2021).
  30. Stacie, S. “In Memory: Seymour Papert,” MIT Media Lab. 20 January 2017. Available online: https://www.media.mit.edu/posts/in-memory-seymour-papert/ (accessed on 29 December 2021).
  31. Master, A.; Cheryan, S.; Moscatelli, A.; Meltzoff, A. Programming experience promotes higher STEM motivation among first-grade girls. J. Exp. Child Psychol. 2017, 160, 92–106. [Google Scholar] [CrossRef] [Green Version]
  32. Mcdonald, C.V. STEM Education: A review of the contribution of the disciplines of science, technology, engineering, and mathematics. Sci. Educ. Int. 2016, 27, 530–569. [Google Scholar]
  33. Jeong, H.; Hmelo-Silver, C.E.; Jo, K. Ten years of Computer-Supported Collaborative Learning: A meta-analysis of CSCL in STEM education during 2005–2014. Educ. Res. Rev. 2019, 28, 100284. [Google Scholar] [CrossRef]
  34. Yücelyiğit, S.; Toker, Z. A meta-analysis on STEM studies in early childhood education. Turk. J. Educ. 2021, 10, 23–36. [Google Scholar] [CrossRef]
  35. Daintith, J.; Wright, E. Robotics; Oxford University Press: New York, NY, USA, 2008. [Google Scholar] [CrossRef]
  36. Butterfield, A.; Ngondi, G.E.; Kerr, A. Programming; Oxford University Press: New York, NY, USA, 2016. [Google Scholar] [CrossRef]
  37. Schumacher, J.; Welch, D.; Raymond, D. Teaching introductory programming, problem solving and information technology with robots at West Point. In Proceedings of the Frontiers in Education Conference, Reno, NV, USA, 10–13 October 2001; Volume 2, pp. F1B/2–F1B/7. [Google Scholar] [CrossRef]
  38. Jawawi, D.N.A.; Mamat, R.; Ridzuan, F.; Khatibsyarbini, M.; Zaki, M.Z.M. Introducing Computer Programming to Secondary School Students Using Mobile Robots. In Proceedings of the 10th Asian Control Conference (ASCC2015), Kota Kinabalu, Malaysia, 31 May–3 June 2015; pp. 1–6. [Google Scholar] [CrossRef]
  39. Sharma, M.K. A study of SDLC to develop well engineered software. Int. J. Adv. Res. Comput. Sci. 2017, 8, 520–523. [Google Scholar]
  40. Suryantara, I.G.N.; Andry, J.F. Development of Medical Record with Extreme Programming SDLC. IJNMT Int. J. New Media Technol. 2018, 5, 47–53. [Google Scholar] [CrossRef]
  41. Pambudi, W.S.; Suheta, T. Implementation of Fuzzy-PD for Folding Machine Prototype Using LEGO EV3. TELKOMNIKA Telecommun. Comput. Electron. Control. 2018, 16, 1625–1632. [Google Scholar] [CrossRef]
  42. Jung, H.-W. A study on basic software education applying a step-by-step blinded programming practice. J. Digit. Converg. 2019, 17, 25–33. [Google Scholar] [CrossRef]
  43. Agamawi, Y.M.; Rao, A.V. CGPOPS: A C++ Software for Solving Multiple-Phase Optimal Control Problems Using Adaptive Gaussian Quadrature Collocation and Sparse Nonlinear Programming. ACM Trans. Math. Softw. (TOMS) 2020, 46, 1–38. [Google Scholar] [CrossRef]
  44. Surfing Scratcher. Assessment Rubric for Coding. 2 September 2019. Available online: https://surfingscratcher.com/assessment-rubric-for-coding/ (accessed on 7 February 2022).
  45. Griffin, P. Assessment for Teaching; Cambridge University Press: Cambridge, UK, 2017; Available online: https://books.google.com/books/about/Assessment_for_Teaching.html?hl=th&id=4i42DwAAQBAJ (accessed on 7 February 2022).
  46. Schraw, G.J.; Robinson, D.H. Assessment of Higher Order Thinking Skills; Information Age Pub.: Charlotte, NC, USA, 2011. [Google Scholar]
  47. Paglia, F.L.; Francomano, M.M.; Riva, G.; Barbera, D.L. Educational robotics to develop executive functions visual spatial abilities, planning and problem solving. Annu. Rev. CyberTherapy Telemed. 2018, 16, 80–86. [Google Scholar]
  48. Lertyosbordin, C.; Maneewan, S.; Nittayathammakul, V. Development of training model on robot programming to enhance creative problem–solving and collaborative learning for mathematics–science program students. J. Thai Interdiscip. Res. 2018, 13, 61–66. Available online: https://ph02.tci-thaijo.org/index.php/jtir/article/view/126274/95463 (accessed on 7 April 2022).
  49. Hu, C.C.; Tseng, H.T.; Chen, M.H.; Alexis, G.P.I.; Chen, N.S. Comparing the effects of robots and IoT objects on STEM learning outcomes and computational thinking skills between programming-experienced learners and programming-novice learners. In Proceedings of the IEEE 20th International Conference on Advanced Learning Technologies, ICALT 2020, Tartu, Estonia, 6–9 July 2020; pp. 87–89. [Google Scholar] [CrossRef]
  50. Kim, S.U. A Comparative Study on the Effects of Hands-on Robot and EPL Programming Activities on Creative Problem-Solving Ability in Children. In Proceedings of the ACM International Conference Proceeding Series, Singapore, Singapore, 15–18 May 2020; pp. 49–53. [Google Scholar] [CrossRef]
  51. Çınar, M.; Tüzün, H. Comparison of object-oriented and robot programming activities: The effects of programming modality on student achievement, abstraction, problem solving, and motivation. J. Comput. Assist. Learn. 2021, 37, 370–386. [Google Scholar] [CrossRef]
  52. Angeli, C. The effects of scaffolded programming scripts on pre-service teachers’ computational thinking: Developing algorithmic thinking through programming robots. Int. J. Child-Comput. Interact. 2022, 31, 100329. [Google Scholar] [CrossRef]
  53. Sarı, U.; Pektaş, H.M.; Şen, Ö.F.; Çelik, H. Algorithmic thinking development through physical computing activities with Arduino in STEM education. Educ. Inf. Technol. 2022, 1–21. [Google Scholar] [CrossRef]
  54. Resnick, L.B. Education and Learning to Think; National Academy Press: Washington, DC, USA, 1987. [Google Scholar]
  55. Lewis, A.; Smith, D. Defining higher order thinking. Theory Into Pract. 1993, 32, 131–137. [Google Scholar] [CrossRef]
  56. King, F.; Goodson, L.; Rohani, F. Higher Order Thinking Skills • Definition • Teaching Strategies • Assessment; Educational Services Program: Tallahassee, FL, USA, 1998. [Google Scholar]
  57. Broadfoot, P.; Black, P. Redefining assessment? The first ten years of assessment in education. In Assessment in Education: Principles, Policy & Practice; Taylor & Francis: Oxfordshire, UK, 2004; Volume 11, pp. 7–26. [Google Scholar] [CrossRef]
  58. Craddock, D.; Mathias, H. Assessment options in higher education. Assess. Eval. High. Educ. 2009, 34, 127–140. [Google Scholar] [CrossRef]
  59. Corliss, S.; Linn, M. Assessing learning from inquiry science instruction. In Assessment of Higher Order Thinking Skills; Schraw, G., Robinson, D.H., Eds.; Information Age Pub.: Charlotte, NC, USA, 2011; pp. 219–243. [Google Scholar]
  60. Selby, C.C. Relationships: Computational thinking, pedagogy of programming, and Bloom’s Taxonomy. In Proceedings of the Workshop in Primary and Secondary Computing Education, London, UK, 9–11 November 2015; pp. 80–87. [Google Scholar] [CrossRef]
  61. Leighton, J.P. A cognitive model for the assessment of higher order thinking in students. In Assessment of Higher Order Thinking Skills; Information Age Pub.: Charlotte, NC, USA, 2011; pp. 151–181. [Google Scholar]
  62. Likert, R. A Technique for the Measurement of Attitudes; Archives of Psychology; The Science Press: New York, NY, USA, 1932; Volume 22. [Google Scholar]
  63. MARA. Announcement for Special Projects No Training Costs with Snacks and Lunch. 24 July 2021. Available online: https://www.dsd.go.th/mara/Region/ShowACT/72089?region_id=23&cat_id=3 (accessed on 6 April 2022).
  64. Rodrigues, I.B.; Adachi, J.D.; Beattie, K.A.; MacDermid, J.C. Development and validation of a new tool to measure the facilitators, barriers, and preferences to exercise in people with osteoporosis. BMC Musculoskelet. Disord. 2017, 18, 1–9. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  65. Zach. How to Calculate Cronbach’s Alpha in R (With Examples). Statology. 28 March 2021. Available online: https://www.statology.org/cronbachs-alpha-in-r/ (accessed on 3 February 2022).
  66. Xu, Q.; Heller, K.; Hsu, L.; Aryal, B. Authentic assessment of students’ problem solving. AIP Conf. Proc. 2013, 1513, 434. [Google Scholar] [CrossRef] [Green Version]
  67. Gao, X.; Grisham-Brown, J. The Use of Authentic Assessment to Report Accountability Data on Young Children’s Language, Literacy and Pre-math Competency. Int. Educ. Stud. 2011, 4, 41. [Google Scholar] [CrossRef]
  68. Rovinelli, R.J.; Hambleton, R.K. On the Use of Content Specialists in the Assessment of Criterion-Referenced Test Item Validity. San Francisco. April 1976. Available online: https://eric.ed.gov/?q=Rovinelli&id=ED121845 (accessed on 10 March 2022).
  69. Müller, C.M.; de Vos, R.A.I.; Maurage, C.A.; Thal, D.R.; Tolnay, M.; Braak, H. Staging of Sporadic Parkinson Disease-Related α-Synuclein Pathology: Inter- and Intra-Rater Reliability. J. Neuropathol. Exp. Neurol. 2005, 64, 623–628. [Google Scholar] [CrossRef]
  70. Stephanie Glen. Cronbach’s Alpha: Definition, Interpretation, SPSS. StatisticsHowTo. 2022. Available online: https://www.statisticshowto.com/probability-and-statistics/statistics-definitions/cronbachs-alpha-spss/ (accessed on 10 March 2022).
  71. Taber, K.S. The Use of Cronbach’s Alpha When Developing and Reporting Research Instruments in Science Education. Res. Sci. Educ. 2018, 48, 1273–1296. [Google Scholar] [CrossRef]
  72. Burry-Stock, J.A.; Shaw, D.G.; Laurie, C.; Chissom, B.S. Rater Agreement Indexes for Performance Assessment. Educ. Psychol. Meas. 2016, 56, 251–262. [Google Scholar] [CrossRef]
  73. Segal, D.; Chen, P.Y.; Gordon, D.A.; Kacir, C.D.; Gylys, J. Development and Evaluation of a Parenting Intervention Program: Integration of Scientific and Practical Approaches. Int. J. Hum.–Comput. Interact. 2010, 15, 453–467. [Google Scholar] [CrossRef]
  74. Hadzhikoleva, S.; Hadzhikolev, E.; Kasakliev, N. Using peer assessment to enhance Higher Order thinking skills. TEM J. 2019, 8, 242–247. [Google Scholar] [CrossRef]
  75. APA Dictionary of Psychology. Problem-Solving Approach. 2022. Available online: https://dictionary.apa.org/problem-solving-approach (accessed on 8 March 2022).
  76. Jonassen, D.H. Learning to Solve Problems: A Handbook for Designing Problem-Solving Learning Environments; Taylor & Francis: Abingdon, UK, 2010; Available online: https://www.routledge.com/Learning-to-Solve-Problems-A-Handbook-for-Designing-Problem-Solving-Learning/Jonassen/p/book/9780415871945 (accessed on 8 March 2022).
  77. Chandrasekaran, B. Design Problem Solving: A Task Analysis. AI Mag. 1990, 11, 59. [Google Scholar] [CrossRef]
  78. Matsun; Boisandi; Sari, I.N.; Hadiati, S.; Saputri, D.F. The effect of physics learning using ardouno uno based media on higher-order thinking skills. J. Phys. Conf. Ser. 2021, 2104, 012014. [Google Scholar] [CrossRef]
  79. Avello-Martínez, R.; Lavonem, J.; Zapata-Ros, M. Codificación y robótica educativa y su relación con el pensamientocomputacional y creativo. Una revisión compresiva. Rev. De Educ. A Distancia (RED) 2020, 20, 1–21. [Google Scholar] [CrossRef] [Green Version]
  80. Marzano, R.J.; Kendall, J.S. Designing & Assessing Educational Objectives; Corwin Press: Thousand Oaks, CA, USA, 2008. [Google Scholar]
  81. Eberly Center. Align Assessments, Objectives, Instructional Strategies. Carnegie Mellon University. 2022. Available online: https://www.cmu.edu/teaching/assessment/basics/alignment.html (accessed on 10 March 2022).
Figure 1. Computational thinking, pedagogy of programming, and Bloom’s old Taxonomy.
Figure 2. Division of the scale.
Figure 3. Literature search and selection flow.
Table 1. Comparison of the old and new cognitive domains.
Thinking Ordering | Old Cognitive Domain [26] | Revision Cognitive Domain [27]
Low | Knowledge | Remember
Low | Comprehension | Understand
Low | Application | Apply
High | Analysis | Analyze
High | Synthesis | Evaluate
High | Evaluation | Create
Table 2. Thinking skills in scientific learning activities.
Level | Science Skills | Learning Activities/Assessment
Low | Demonstrating knowledge of scientific concepts, laws, theories, procedures and instruments | Recall; Define; Describe; List; Identify
High | Applying scientific knowledge and procedures to solve complex problems | Formulate questions; Hypothesize/predict; Design investigations; Use model; Compare/contrast/classify; Analyze; Find solutions; Interpret; Integrate/synthesize; Relate; Evaluate
Table 3. Higher order thinking behaviors.
Dimension | Analyze | Evaluate | Create
Factual | Select | Check | Generate
Conceptual | Relate | Determine | Assemble
Procedural | Differentiate | Conclude | Compose
Metacognitive | Deconstruct | Reflect | Actualize
Table 4. Study context characteristics.
Paper_ID | Year | Study Environment | Region | Gender
La Paglia et al. [47] | 2018 | Elementary school | Italy | Mixed
Lertyosbordin et al. [48] | 2018 | Middle school | Thailand | Mixed
Hu et al. [49] | 2020 | Higher Education | Taiwan | Not available
Kim [50] | 2020 | Elementary school | Republic of Korea | Mixed
Çınar and Tüzün [51] | 2021 | High school | Turkey | Mixed
Angeli [52] | 2022 | Higher Education | Cyprus | Mixed
Sari et al. [53] | 2022 | Higher Education | Turkey | Mixed
Table 5. Research design characteristics.
Paper_ID | Research Design | Sample Design | Sample Size | Manipulated Variable | Dependent Variable
La Paglia et al. [47] | Two-group pre-test & post-test | Random | 30 people (group 1: 15; group 2: 15) | Robot programming activities | Higher order thinking includes: forecasting, planning, and problem solving
Lertyosbordin [48] | One-group pre-test & post-test | Random | 40 people | Robot programming activities | Creative problem-solving skills include: problem analysis, finding a solution and robot testing
Hu et al. [49] | Two-group post-test | Purposive | 13 people (group 1: 6; group 2: 6) | Robots and IoT programming courses | Computational-thinking learning outcome
Kim [50] | Two-group pre-test & post-test | Purposive | 45 people (group 1: 22; group 2: 23) | Hands-on robot and EPL programming activities | Creative problem solving includes: understanding the problem, generating ideas, planning for action, and an evaluation
Çınar and Tüzün [51] | Two-group pre-test & post-test | Purposive | 81 people (group 1: 41; group 2: 40) | Object-oriented and robot programming activities | Achievement, abstraction, problem solving, and motivation
Angeli [52] | One-group pre-test & post-test | Purposive | 50 people | Robot programming activities | Computational thinking skills include: skills of sequencing, flow of control, and debugging
Sari et al. [53] | One-group pre-test & post-test | Purposive | 24 people | Arduino coding activities | Algorithmic-thinking skills include: understanding the problem, determining the solution strategies, and creating the algorithm
Table 6. The synthesis of the components and indicators of the robot programming skill assessment based on higher order thinking.
Components and indicators, each rated for its level of consistency with the evidence-based references [47,48,49,50,51,52,53]:
1. The ability to solve problems step by step
  1. Describe the problem and the sequence of ways to solve it.
  2. Draw the flowcharts or pseudocodes to show the sequence of ways to solve problems.
  3. Change the sequence of steps if the results are not achieved.
  4. Tackle the tasks presented by breaking them down into smaller tasks.
  5. Capture the issues that can cause problems to repeat.
2. The ability to create computer programs
  6. Create a program using a computer language from a blank page.
  7. Create a program with a single-decision condition.
  8. Create a program with the nested structure of decision conditions.
  9. Create a variable to control the loop task programs.
  10. Create a variable and input data that affect the output.
  11. Build your own program from the beginning, until you achieve the objectives.
  12. Create a function that can modify parameters.
3. The ability to connect to the robot
  13. Connect the port between the computer and the microcontroller.
  14. Create objects for using analog and/or digital signals.
  15. Create a graphical user interface (GUI) to display the analog and/or digital inputs.
  16. Create a graphical user interface (GUI) for the digital outputs.
Level of consistency: ○ Not at all, ◔ Slightly, ◑ Moderately, ◕ Very, ● Extremely.
