Article

Teaching Artificial Intelligence and Machine Learning in Secondary Education: A Robotics-Based Approach

by Georgios Karalekas, Stavros Vologiannidis * and John Kalomiros
Department of Computer, Informatics and Telecommunications Engineering, International Hellenic University, 62124 Serres, Greece
* Author to whom correspondence should be addressed.
Appl. Sci. 2025, 15(8), 4570; https://doi.org/10.3390/app15084570
Submission received: 9 March 2025 / Revised: 12 April 2025 / Accepted: 15 April 2025 / Published: 21 April 2025
(This article belongs to the Special Issue ICT in Education, 2nd Edition)

Abstract:
The rapid advancement of Artificial Intelligence (AI) and Machine Learning (ML) highlights the need for innovative, engaging educational approaches in secondary education. This study presents the design and classroom implementation of a robotics-based lesson aimed at introducing core AI and ML concepts to ninth-grade students without prior programming experience. The intervention employed two low-cost, 3D-printed robots, each used to illustrate a different aspect of intelligent behavior: (1) rule-based automation, (2) supervised learning using image classification, and (3) reinforcement learning. The lesson was compared with a previous implementation of similar content delivered through software-only activities. Data were collected through classroom observation and student–teacher discussions. The results indicated increased student engagement and enthusiasm in the robotics-based version, as well as improved conceptual understanding. The approach required no specialized hardware or instructor expertise, making it easily adaptable for broader use in school settings.

1. Introduction

1.1. Motivation

The integration of artificial intelligence (AI) and machine learning (ML) education into the school environment is a necessity driven by rapid technological advancements. Educating students on the basic concepts and capabilities of these technologies is essential for improving their technical skills and fostering critical and creative thinking about the role of artificial intelligence in society [1,2]. Understanding that AI and robotics are tools rather than threats can help students see their potential for collaboration and adaptation in a rapidly evolving technological landscape [3].
The educational process can be enriched through multimedia, such as videos and interactive simulations, but engaging directly with robots provides a more immersive experience. Observing how robots learn through reinforcement learning, neural networks, and natural language processing provides students with concrete insights into AI applications [4]. These applications demonstrate how robots function in various environments, from industry and medicine to education and social care [5].

1.2. Robotics in STEM Education

The use of robotic systems as learning tools is particularly valuable in STEM (Science, Technology, Engineering, Mathematics) education [6]. Robots provide real-time interaction and feedback, making learning more engaging compared to traditional methods [7]. Constructing, programming, and experimenting with robots helps students develop technical and cognitive skills essential for AI literacy [8].
Practical engagement with robotics not only enhances students’ understanding of AI and ML but also fosters problem-solving, collaboration, and creativity [8]. Hands-on interaction can also boost confidence and inspire curiosity, encouraging students to explore AI-related fields further [9].
A well-structured introduction is crucial for all lessons because it establishes student engagement. Research indicates that factors such as personality, emotional disposition, and motivation influence students’ ability to connect with new material [10]. Given that today’s students are “digital natives” [11], innovative teaching methods are required to maintain their interest, especially in AI and ML subjects. Robotics serves as an effective tool for introducing AI and ML concepts in an interactive and meaningful way.

1.3. Purpose and Objectives

This article explores a novel approach designed to enhance student engagement and comprehension when teaching AI and ML concepts. The lesson plan includes two simple 3D-printed educational robots, capable of performing everyday tasks. The robots are programmable using free software that does not require prior programming knowledge from either students or teachers. They are designed to be cost-effective and easy to construct, with freely available files and usage instructions, making them accessible even in educational environments with limited resources. The lesson begins with the presentation of the two robots, accompanied by short videos demonstrating their real-world applications. Students then engage in hands-on interaction with the robots. The lesson is structured around three key stages: (1) programming robots to perform repetitive movements without the use of machine learning, (2) training robots to recognize human prompts for interaction through a trained ML model, and (3) simulating autonomy using machine learning techniques such as reinforcement learning to help students distinguish between simple AI applications and advanced ML processes. The teaching approach was tested in practice and evaluated through experimental implementation in classroom settings.
The rest of the article provides a detailed presentation of the relevant research literature in Section 2, highlighting the differences and improvements introduced by the present study compared to existing approaches. In Section 3, the description of the robots is presented, along with the teaching methodology and the experimental details. Subsequently, in Section 4, the main observations, as recorded during the experimental implementation, are presented. Section 5 includes the interpretation of the findings and the discussion of the results, while Section 6 summarizes the main conclusions of the study and points out suggestions for future research and improvements.

2. Related Work

The use of robots for teaching concepts of machine learning (ML) and artificial intelligence (AI) has been the subject of extensive research, with various approaches highlighting the advantages and challenges of this educational practice. Robotic kits have been used for the hands-on familiarization of primary and secondary education students with basic ML concepts, enhancing their engagement through practical activities. For example, the educational robot Robobo allows students to program a “robotic pet” that physically interacts with users, helping them understand human–robot interaction and the basic principles of artificial intelligence [12]. While such approaches facilitate understanding, they require significant investment in equipment and proper preparation by educators [13].
Additionally, the use of virtual robotic platforms, such as ARtonomous, provides students with the opportunity to understand reinforcement learning (RL) through simulations. These platforms reduce the costs and risks associated with using physical robots, but the lack of physical interaction may limit students’ experience [14]. Other research has explored the use of educational robotics to introduce ML and AI concepts into K–12 curricula, enhancing students’ familiarity with cutting-edge technologies. However, the complexity of these technologies may pose challenges for beginner-level students [15].
The use of autonomous robots for the practical study of basic AI algorithms, such as Learning Automata, has also been examined. These robots offer students the chance to experiment in real-world conditions but require a higher level of technical expertise, which may not be accessible to all [16]. Moreover, robots with AI capabilities have been proposed as tools for promoting AI literacy, providing activities that combine human–robot interaction with the understanding of ethical issues. However, the development and maintenance costs of such systems remain significant barriers [17].
Finally, the use of cloud-based digital twins allows interaction with simulated robots, reducing the need for physical equipment. While this approach facilitates access to advanced technologies, it depends on the availability of high-quality internet connectivity, which can be a limiting factor [18]. Additionally, students can develop their own robotics projects using the Robot Operating System (ROS), which provides a platform for creating applications ranging from simple mobile robots to complex industrial arms [19]. In reviews of robots used in educational processes [20,21,22], and even of computer simulations without physical robotic systems [23], a single robot is typically employed, which limits its suitability for illustrating several different machine learning examples within the narrow time frame of a school lesson.
Overall, these studies highlight the potential of educational robotics in teaching ML and AI. Robots offer a practical and interactive way to introduce students to cutting-edge concepts such as reinforcement learning, automated decision-making systems, and the algorithms underpinning artificial intelligence. Furthermore, several studies have emphasized the role of embodiment and human–robot interaction in shaping how students perceive and engage with robotic systems: students perceive the robot not merely as a machine but as a character with which they can interact, which enhances engagement and emotional involvement in the learning process [24,25,26,27]. However, these benefits come with challenges that require attention. Equipment and infrastructure costs can be barriers for schools with limited resources, restricting access to this technology. The technical infrastructure requires specialized staff and support, while the complexity of the robots and the technologies they incorporate may deter beginner students.
Additionally, students are rarely exposed to applications that appear intelligent yet do not employ machine learning techniques, which limits their understanding of what actually constitutes machine learning. Classroom exposure to robots comparable to those employed in real-world or industrial applications is also restricted. Many classroom robotics activities rely on autonomous robots that require specially designed rooms with space to move, which is uncommon in school classrooms or labs. As a result, the educational community needs to find ways to reduce these barriers, making robotics technologies more accessible and adaptable to the needs and capabilities of a broader range of students.
In response to the aforementioned challenges, the proposed educational approach introduces a novel and accessible framework for teaching AI and ML in secondary education. Unlike most existing studies that rely on a single robot or virtual platforms, this methodology employs two distinct physical robots, each designed to highlight different concepts and use cases. This multi-robot strategy offers students the opportunity to compare multiple types of intelligent behavior in practice, leading to a deeper and more nuanced understanding of the underlying principles. In addition, the proposed lesson is carefully designed to overcome typical barriers such as high cost, technical complexity, and the need for prior programming or mathematical knowledge. It utilizes low-cost, open-source components, printable 3D parts, and freely available web-based ML tools, allowing both students and educators to engage with the content regardless of prior expertise or infrastructure. This combination of accessibility, modularity, and continuity beyond the classroom directly addresses limitations identified in previous work and expands the potential reach of AI and ML education across diverse school settings.

3. Methods

During the previous school year, a course was designed and implemented for middle school students on artificial intelligence and machine learning. The course consisted of a theoretical part, where students were introduced to the fundamental concepts of AI and ML, and a hands-on laboratory part, where they trained machine learning models using the online platform ML4Kids [28]. They developed models capable of recognizing different images and sounds. Subsequently, students utilized the trained models in applications they created themselves using the Scratch [29] programming language. Although the course initially captured students’ interest, it was observed that many of them gradually lost engagement both in the theoretical section and during the laboratory activities.
Drawing on prior experience in teaching robotics and designing robots [30,31], a new educational approach was developed that integrated robotics into the teaching of artificial intelligence concepts, with a focus on machine learning. After reviewing existing market options and the relevant literature [32], two robots were designed and developed for use in an experimental course conducted in a school setting. This approach addressed key challenges in teaching artificial intelligence (AI) and machine learning (ML), making these concepts more accessible and applicable for students and educators.
Specifically, the lesson is designed so that students do not require prior programming knowledge, as all necessary code consists of open-source software and is provided free of charge, ensuring ease of use and unrestricted access. Additionally, the machine learning application used is freely available online, allowing students to continue exploring the subject outside the classroom, even without the presence of the robot.
At the same time, the lesson does not require advanced mathematical knowledge, making it accessible to a broad student audience regardless of their academic background. The robots used in the lesson are built from affordable and easily accessible materials, which can be readily found in the market, while their plastic components can be produced using 3D printers, with all necessary design files provided for printing. This enables schools to implement the lesson without incurring high costs.
Furthermore, the lesson is designed to be taught by educators without extensive knowledge of AI and ML, making it accessible to teachers who wish to introduce students to these concepts without requiring prior specialization. These factors significantly contribute to increasing accessibility to such lessons, which are often absent from standard curricula due to their complexity or infrastructure and knowledge requirements.
Another key feature of the lesson is its extensive introductory segment, which is designed to capture students’ interest and keep them engaged throughout the learning process. The use of robots as a teaching tool significantly enhances students’ attention and focus compared to traditional teaching methods that lack physical interaction. Additionally, the structure of the lesson helps students understand the fundamental difference between simple data entry and actual machine learning, providing a clear distinction between pre-programmed actions and processes based on machine learning algorithms.
Finally, a major advantage of the proposed approach is that the robots used do not require specially designed classrooms or dedicated spaces for movement, allowing them to be easily integrated into any educational environment, regardless of available facilities. This flexibility facilitates the implementation of the lesson in different schools and educational settings, promoting its widespread adoption and use among a diverse range of students. The robots are described in detail in the following paragraphs.

3.1. Using a Robot Arm in Class (Arduino Mini Manipulator)

The first robot (Arduino Mini Manipulator) is based on the Arduino Mega microcontroller, facilitating programming and integration with various sensors and actuators. The robot is the orange robotic arm shown on the left of Figure 1a, together with its joystick controller, shown on the right of the same figure. The joystick consists of five potentiometers whose values control the arm's joints and gripper. The Arduino Mini Manipulator is used in two ways: for storing and reproducing movements, and as a robot that collaborates with humans through a machine learning application. Figure 1b shows a CAD schematic of the Arduino Mini Manipulator with its dimensions.

3.1.1. Case 1—Movement Learning and Reproduction of Movements

Educational Goals

The first use of the Arduino Manipulator is to spark students’ interest by introducing them to the concept of teaching a robot to perform repetitive movements. This approach does not require students to write any code; instead, it relies on a simple mechanism in which the operator manually rotates the joints to the desired positions. Through this process, students learn how simple, repetitive tasks can be automated by robotic systems. Unlike traditional robots that require programming by the user, this robot is already configured to memorize and reproduce movements through simple user manipulation. This approach allows students to observe the automation process in action without requiring advanced programming knowledge while fostering the idea that many everyday tasks can be automated, broadening their perspective on the possibilities of robotics and artificial intelligence.

Implementation

By adjusting the joystick, which mimics the arm's structure, the operator sets the pose that the robotic arm should adopt. The Arduino's analog inputs read the potentiometer positions, digitize them, and convert them into commands that drive the servo motors to the corresponding angles. This allows the instructor to place the arm's joints in various positions, so that the robot assumes the desired pose each time. Pressing a button located in the middle of the structure (Figure 1) records the current pose in the microcontroller's memory. Repeating this process with different poses produces a sequence of recorded positions, which can then be replayed in the order they were stored; this is triggered by pressing the same button twice consecutively within an interval of less than half a second. Note that next to this button there is a second one (Figure 1a) that temporarily pauses the robot's operation.
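For illustration, the core record-and-replay logic can be sketched as a small Python simulation of the Arduino flow. The names (`adc_to_angle`, `PoseRecorder`) and the 10-bit ADC-to-angle mapping are assumptions modeled on typical Arduino sketches, not the authors' Wiring code:

```python
def adc_to_angle(adc_value: int) -> int:
    """Map a 10-bit ADC reading (0-1023) from a potentiometer to a servo
    angle (0-180), mirroring Arduino's map(value, 0, 1023, 0, 180)."""
    return adc_value * 180 // 1023

class PoseRecorder:
    """Stores arm poses (tuples of five joint/gripper angles) and replays
    them in the order they were recorded."""
    def __init__(self):
        self.poses = []

    def record(self, pot_readings):
        # Convert the five raw potentiometer readings to servo angles
        # and append the resulting pose to memory.
        pose = tuple(adc_to_angle(v) for v in pot_readings)
        self.poses.append(pose)
        return pose

    def replay(self):
        # Return the stored sequence in recording order.
        return list(self.poses)
```

In use, each button press would call `record()` with the current potentiometer readings, and a double press would iterate over `replay()` to drive the servos.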

Software

The robot software is provided by the authors as open-source software available in a public repository [33]. The code is written in the Wiring language for the Arduino microcontroller and controls the robot’s joints and movement memorization. In Figure 2, the flow chart of the script that controls the robot is provided.

3.1.2. Case 2—Robotic Manipulator as a Cobot

Educational Goals

The second use of the Arduino Mini Manipulator is to introduce the concept of collaborative robots (cobots) alongside an example of machine learning. Collaborative robots are designed for direct human–robot interaction within a shared space, typically working in close proximity to humans, often without the need for extensive safety barriers [34]. Through this activity, students see how a robot can act as a human partner, performing tasks in harmony with its users. The activity also demonstrates how machine learning (ML) applications can enable the robot to "learn". In the subsequent lesson, students, organized in groups, train an ML model using appropriate data and connect it to the robot so that it can act as their partner in a simulated task. This process allows students to actively experience human–machine collaboration and develop machine learning skills as they use the Teachable Machine [35] web application to train a model.
This use of the robot also helps students distinguish machine learning from the simple memorization performed in the first case. In the Teachable Machine web application [35], the model needs numerous photos of each human pose in order to learn it, whereas in the first case the robot stores a pose directly from the user via the controller at the press of a single button. The robot merely executes the movements in a predefined order and cannot adapt any of these programmed positions. It is akin to placing the ingredients for a beverage in predefined positions and programming a mechanical arm to perform a fixed sequence of movements to prepare it: the robot has not learned the recipe but has simply stored the angles of its joints and repeats them at a fixed rate. For instance, if an ingredient is missing or has moved from its predetermined position, the robot will not detect this and will continue the process.
In contrast, in the second example, where the machine learning model is trained, this is no longer the case. The machine learning model learns human body poses and can recognize them regardless of whether different students assume them or whether the specific pose of one student differs slightly from another’s. With these two different uses of the robot, students understand the difference between simply entering positions and learning the positions through machine learning.

Implementation

To use the robot as a collaborator, a model must first be trained to recognize the poses a human body takes. This is carried out with the Google Teachable Machine web application [35], as seen in Figure 3, where the model can be trained to learn various positions of the body. Training data are provided through the computer's camera, with at least 80 photos captured for each position. After the photos are saved, training can proceed as shown in Figure 4. The application allows the number of training epochs and the learning rate to be adjusted, and the model's performance can be monitored as it improves during training, as seen in Figure 5. After training, the model runs directly in the browser and can be tested as shown in Figure 6. If the accuracy is unsatisfactory, the photo samples and training parameters can be modified and the model retrained until satisfactory results are obtained. Once training is complete, the model can be exported: the application returns a simple HTML and JavaScript script that includes the URL where the trained model is stored, which can be saved locally. Running this script in a browser opens a simple window where the model runs live and displays its predictions.
To connect the robot to the application running on the computer locally, a JavaScript script was also developed that accepts the recognition data and sends it serially via the USB port to the Arduino microcontroller that controls the robot.
The robot receives data encoded in numbers through its serial port and, depending on the number it receives, performs a predetermined movement. The predetermined movements are already specified in the script that runs on the Arduino microcontroller. If there is adequate time in the lesson, students are given the opportunity to change the positions taken by the robot by changing the angles of the arm joints. The script is appropriately commented to enable even inexperienced students to change the positions with ease.
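The dispatch step on the microcontroller side can be illustrated with a short Python sketch. The class indices, pose labels, and joint angles below are hypothetical examples for illustration, not the values in the authors' Arduino script:

```python
# Hypothetical preset poses keyed by the class index received over serial.
# On the real robot these presets live in the Arduino sketch; each tuple
# holds the five servo angles (four joints plus the gripper).
PRESET_POSES = {
    0: (90, 90, 90, 90, 90),    # e.g. neutral pose
    1: (45, 120, 60, 90, 10),   # e.g. reach toward the user
    2: (135, 60, 120, 90, 170), # e.g. pick-up pose
}

def dispatch(command_byte: int):
    """Return the preset joint angles for a received command number,
    or None for an unrecognized command (which the robot would ignore)."""
    return PRESET_POSES.get(command_byte)
```

Editing the angle tuples corresponds to the in-class activity where students change the positions the robot assumes by modifying the commented script.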

Software

The robot’s code is also written in Wiring, as in the previous case, but the script is different and manages the robot’s joints. To enable the robot to detect poses, the user trains a pose classification model using the Teachable Machine [35] web application. This application generates a simple HTML and JavaScript script that runs in the browser, utilizing the computer’s camera to classify the user’s pose in real time. To establish communication between the web app and the robot, a JavaScript script must be executed locally, enabling a serial connection via USB. The complete setup is illustrated in Figure 7.

3.1.3. Hardware

  • Sensors:
      • Five (5) potentiometers, which serve as sensors for the desired position of each joint of the robot’s arm and gripper.
      • One on/off switch that temporarily suspends the robot’s operation.
      • One push button that records the current position of the joints in the microcontroller’s memory when pressed once and triggers the robot to repeat the recorded movements if pressed twice consecutively within one second.
  • Actuators:
      • Five (5) servo motors: four for joint movements and one for gripper operation.
      • One LED that provides visual feedback when a position is recorded in the microcontroller’s memory. The LED lights up momentarily during recording and blinks steadily when the robot is in suspension mode.
  • Power source:
      • A DC power supply of 5 Volts and 0.6 Amperes.
  • Other components:
      • Breadboard, universal PCB with holes, 2 resistors of 1 kΩ, and 1 resistor of 22 Ω.

3.1.4. Assembly and Cost

The total cost of constructing the robot is approximately €60, making it significantly more economical than commercially available educational robots. Assembly is straightforward, requiring only basic tools such as a screwdriver and a soldering iron. Detailed assembly instructions are provided along with the 3D design files [33], ensuring easy reproduction.

3.2. Using the SelfLearn Robot

3.2.1. SelfLearn Robot

The second robot (SelfLearn robot) as seen in Figure 8 and Figure 9, demonstrates how a robot can learn to perform a task autonomously using a simple reinforcement learning algorithm. It consists of a two-degrees-of-freedom arm and can move freely on a line, thanks to its three supporting wheels. The robot includes an ultrasonic sensor that measures its distance from a fixed obstacle. Based on these measurements, it determines whether it is moving away, approaching, or remaining stationary relative to the obstacle. The robot moves its arm to random positions until it discovers the appropriate combination of movements (rotations to specific angles) through reinforcement learning, allowing it to move away from the obstacle. In this way, it gradually learns how to move forward, leaving the obstacle behind. The design of this robot minimizes arm complexity and achieves successful results within a short time period using reinforcement learning.

3.2.2. Teaching Reinforcement Learning

Educational Goals

The SelfLearn robot provides students with a tangible example of how a robot can learn and evolve its behavior using reinforcement learning (RL). It serves as an engaging tool that facilitates understanding of how such technologies can be developed and applied in real-world contexts. The robot’s adjustable arm lengths offer students the opportunity to observe learning behavior that adapts independently of the hardware configuration. The reinforcement learning method implemented is simple and does not require advanced mathematical knowledge, making it suitable for beginners. During the practical session, students interact with the robot and are encouraged to experiment freely, supported by detailed code comments that enhance accessibility. This allows students to experience a learning system that operates autonomously, unlike the pre-programmed systems used in earlier stages of the lesson.

Implementation

The SelfLearn robot is pre-programmed to perform sequences of random joint movements and to evaluate each sequence based on the resulting displacement from a fixed obstacle, measured via an ultrasonic sensor. Each of the two servo-operated joints can rotate between 0° and 180°. A complete movement cycle consists of four alternating activations: joint one, joint two, joint one again, and joint two again. After each cycle, the robot measures its distance from the obstacle. If a given movement sequence results in greater displacement than all previous attempts, the corresponding joint angles are recorded as a successful configuration. This trial-and-error process is repeated for 40 training episodes. If no meaningful displacement is observed (e.g., less than 2 cm), the training cycle restarts. After completing the training phase, the robot executes the most successful movements as a demonstration of the learned behavior. Students are invited to reflect on the process and suggest improvements to the code, with support and guidance provided by the instructor to promote optimization and deeper understanding.
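The trial-and-error loop described above can be sketched as a small Python simulation. The `displacement` function below is a hypothetical stand-in for the ultrasonic distance measurement, and the overall structure (random search keeping the best movement cycle, restarting when progress stays under 2 cm) mirrors the description rather than reproducing the authors' Wiring code:

```python
import random

def displacement(angles):
    """Toy stand-in for the ultrasonic measurement: how far (in cm) one
    movement cycle pushes the robot. On the real robot this value comes
    from the sensor, not a formula."""
    a1, a2, a3, a4 = angles
    return max(0.0, (a1 - a2 + a3 - a4) / 36.0)  # hypothetical dynamics

def train(episodes=40, min_progress=2.0, rng=None):
    """Random-search reinforcement loop: keep the joint angles of the
    best-performing movement cycle seen over the training episodes."""
    rng = rng or random.Random(0)
    best_angles, best_dist = None, 0.0
    for _ in range(episodes):
        # One cycle = four alternating servo activations:
        # joint 1, joint 2, joint 1 again, joint 2 again (each 0-180 deg).
        angles = [rng.randint(0, 180) for _ in range(4)]
        d = displacement(angles)
        if d > best_dist:
            best_angles, best_dist = angles, d
    if best_dist < min_progress:
        # No meaningful displacement (< 2 cm): restart the training cycle.
        return train(episodes, min_progress, rng)
    return best_angles, best_dist
```

After training, the real robot replays the best-performing sequences to demonstrate the learned behavior; students can experiment by changing the episode count or the progress threshold.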

3.2.3. Software

The robot’s software is available as open-source code in a public repository [33] and is written in the Wiring programming language for the Arduino platform. The robot’s behavior is governed by a simple reinforcement learning algorithm. It explores different combinations of arm movements through random trials. Movements that result in improved displacement are saved in memory, while others are discarded. This exploration–exploitation cycle continues until the robot has completed approximately 40 training episodes, a value selected based on classroom time constraints and observed convergence in performance. Upon completion, the robot reproduces ten of the most effective movement sequences to visibly demonstrate the outcomes of the learning process. The logic of the robot’s training and execution flow is illustrated in Figure 10.

3.2.4. Hardware

  • Sensors:
      • One ultrasonic sensor for measuring the robot’s distance from a fixed obstacle.
  • Actuators:
      • Two (2) servo motors for arm operation.
  • Power source:
      • Six (6) 1.5 V AA batteries.
  • Other components:
      • Metal axles.

3.2.5. Assembly and Cost

The total cost of constructing the SelfLearn robot is approximately €30, making it significantly more economical than equivalent commercially available educational robots. Assembly is simple, requiring only basic tools such as a screwdriver and glue. Detailed assembly instructions are provided along with the 3D design files on GitHub [33], ensuring easy reproduction.

3.3. Study Design and Data Collection

This study was conducted during the academic year in a Greek lower secondary school with the participation of 20 ninth-grade students (12 girls and 8 boys, aged 14–15). The intervention consisted of two 45-min lessons focusing on artificial intelligence (AI) and machine learning (ML), incorporating the use of two physical robots and web-based machine learning platforms. The two sessions were separated by one week and delivered by a guest instructor supported by the school’s computer science teacher.
The first session was held in the classroom and included demonstrations, videos, and discussions introducing the concepts of AI, machine learning, and robotics. The second session took place in the school’s computer lab, where students were divided into pairs per computer station. Each station was equipped with an Arduino Mini Manipulator robot, a USB cable, and access to JavaScript scripts and the Arduino IDE (https://docs.arduino.cc/software/ide/ (accessed on 23 November 2024)). Students trained their own machine learning models using the Teachable Machine platform and integrated them with the robot.
Data were collected through both quantitative and qualitative means. At the end of the second lesson, students completed a five-item questionnaire using a 5-point Likert scale (1 = not at all, 5 = very much), designed to assess their interest in the lectures and topic as well as their perceived understanding of core concepts such as the difference between preprogrammed behavior and machine learning, the use of the Teachable Machine platform, and reinforcement learning.
In addition to the questionnaire, qualitative data were gathered through systematic classroom observation. During both sessions, the classroom teacher and an additional computer science instructor documented behavioral indicators of engagement and motivation. These included attentiveness, participation, willingness to collaborate, and the frequency of unsolicited questions. Particular emphasis was placed on contrasting these behaviors with those recorded during a previous implementation of the same content delivered without the use of robots. In that earlier lesson, 36 students (21 boys and 15 girls) were taught AI/ML concepts using software-only tools such as Scratch [29] and ML4Kids [28]. The same questionnaire was administered and similar observational data were collected for comparative purposes.
Analysis of the data involved descriptive statistics for the questionnaire responses (means and standard deviations), while open-ended student responses and teacher notes were examined thematically to identify trends in engagement, understanding, and attitudes toward the use of robotics in AI education.
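For illustration, the descriptive statistics reported in Section 4 can be reproduced with a few lines of JavaScript. The sketch below (ours, not part of the lesson materials) computes the mean and sample standard deviation (n − 1 denominator, which matches the reported values) for the lecture-interest responses of the earlier, software-only implementation:

```javascript
// Mean and sample standard deviation (n - 1 denominator) of Likert responses.
function mean(xs) {
  return xs.reduce((a, b) => a + b, 0) / xs.length;
}

function sampleSd(xs) {
  const m = mean(xs);
  const ss = xs.reduce((acc, x) => acc + (x - m) ** 2, 0);
  return Math.sqrt(ss / (xs.length - 1));
}

// Responses to "Were the lectures interesting?" from the earlier implementation
// (n = 36): 2x "a little" (2), 6x "moderately" (3), 12x "quite" (4),
// 16x "very much" (5).
const lectureInterest = [
  ...Array(2).fill(2), ...Array(6).fill(3),
  ...Array(12).fill(4), ...Array(16).fill(5),
];

console.log(mean(lectureInterest).toFixed(2));     // 4.17
console.log(sampleSd(lectureInterest).toFixed(2)); // 0.91
```

Running the same two functions over the other response distributions reproduces the remaining means and SDs reported below.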
The implementation did not require students to have prior programming or advanced mathematical knowledge. All software tools and robot control scripts were open-source and accompanied by detailed documentation. Ethical approval for the study was obtained through the school administration, and all activities were carried out in alignment with educational ethical standards. No personal data were collected.

3.4. Education Plan

The educational plan focused on delivering AI and ML concepts to ninth-grade students through an engaging and accessible two-part lesson. The first session, lasting 45 min, was held in the classroom and included multimedia presentations and live demonstrations. These demonstrations featured the Arduino Mini Manipulator robot performing pre-recorded movements to introduce concepts such as automation and collaborative robots (cobots). Key distinctions were drawn between rule-based automation and learning-based behavior, prompting class discussions on the differences between programming and machine learning.
The second 45-min session took place in the school’s computer lab. Students worked in pairs to train image classification models using Google’s Teachable Machine platform. After completing the training, they integrated their models with the robot using the provided scripts and observed the robot performing actions based on real-time pose recognition. In the final phase of the lab, students were introduced to the SelfLearn robot, which demonstrated reinforcement learning. The robot performed trial-and-error movements to maximize its distance from an obstacle, illustrating adaptive behavior without direct programming.
The structure emphasized interactivity and hands-on learning while requiring no prior coding experience. All materials, including robot designs and code, were made freely available, ensuring that students and educators could replicate the activities in a variety of school settings. The plan was designed to gradually build conceptual understanding by moving from simple automation to more advanced machine learning techniques, using robots as tangible examples to support theoretical instruction.

3.4.1. First Lesson

The lesson began with the presentation of a video showing a robot memorizing movements demonstrated by a human, without the need for computer programming [36]. This demonstration captured students’ interest and initiated a brief discussion about robotic capabilities and training methods. Following the video, the first robot—Arduino Mini Manipulator—was introduced. The teacher manually adjusted the robot’s arm to specific positions, which were recorded by pressing a button, enabling the robot to replicate the movements in sequence. It was clarified to students that this process did not involve machine learning but was a simple recording of joystick potentiometer values to simulate robotic arm operation.
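The record-and-replay behavior described above involves no learning: the robot only stores the joint positions captured at each button press and later plays them back in order. A minimal sketch of that logic follows (in JavaScript for illustration; the actual robot runs Arduino Wiring code, and `moveServosTo` is a hypothetical stand-in for the real servo commands):

```javascript
// Record-and-replay automation, as in the Arduino Mini Manipulator demo:
// no learning, only storage and ordered playback of joint positions.
const recorded = [];

// Called when the record button is pressed: store the current
// potentiometer readings (one angle per joint).
function recordPosition(jointAngles) {
  recorded.push([...jointAngles]); // copy, so later edits don't alter history
}

// Replay every stored position in the order it was recorded.
// `moveServosTo` stands in for the hardware servo commands.
function replay(moveServosTo) {
  for (const pose of recorded) {
    moveServosTo(pose);
  }
}

// Example: record three poses, then replay them.
recordPosition([90, 45, 120]);
recordPosition([60, 80, 100]);
recordPosition([90, 90, 90]);

const played = [];
replay(pose => played.push(pose));
console.log(played.length); // 3
```

The contrast with the later activities is the point of the demonstration: here every behavior the robot exhibits was explicitly stored by a human, with no generalization from data.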
Subsequently, students watched a second video [37] featuring a collaborative robot (cobot) working alongside humans. The importance of cobots and their growing role in future work environments was highlighted. An application involving the first robot and Google’s Teachable Machine [35] was demonstrated, in which a pose recognition model detected head movements and triggered corresponding movements in the robotic arm. Students were informed that in the next lesson, they would train their own models and apply them to control the robot’s actions.
To introduce the concept of reinforcement learning, a third video [38] was shown, depicting a two-month-old baby moving randomly. The teacher used this analogy to explain how both humans and machines learn by observing the outcomes of their movements. Reinforcement learning was presented as a method enabling machines to learn in a similar fashion [39]. The lesson concluded with a demonstration of the second robot, capable of learning arm movements independently to move forward. This presentation elicited strong interest from the students, who anticipated actively training models in the upcoming session.

3.4.2. Second Lesson

The second lesson was conducted in the school’s computer laboratory. Students were paired and assigned one Arduino Mini Manipulator per workstation. Each computer was equipped with internet access and the required software tools, including JavaScript and Arduino Wiring scripts. USB connections facilitated communication between the computers and robots. Students were instructed to train a machine learning model using Google’s Teachable Machine [35] platform, configuring it to recognize two distinct body postures.
During training, students explored parameters such as “epochs”, “batch size”, and “learning rate” and interpreted the model’s performance using the platform’s visual graphs. These concepts were explained in simplified terms to ensure accessibility for all participants.
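These parameters can be made concrete with a toy training loop. The sketch below is our illustration of the underlying idea, not Teachable Machine’s internals: it fits a single weight w to data generated by y = 2x using mini-batch gradient descent, where “epochs” is the number of passes over the data, “batch size” the number of examples averaged per update, and the “learning rate” the size of each update step.

```javascript
// Toy mini-batch gradient descent fitting y = w * x to data where y = 2x.
// Illustrates "epochs", "batch size" and "learning rate" only; this is NOT
// how Teachable Machine is implemented.
const xs = [0, 1, 2, 3];
const ys = xs.map(x => 2 * x);

function train({ epochs, batchSize, learningRate }) {
  let w = 0; // initial guess
  for (let epoch = 0; epoch < epochs; epoch++) {
    for (let i = 0; i < xs.length; i += batchSize) {
      const bx = xs.slice(i, i + batchSize);
      const by = ys.slice(i, i + batchSize);
      // Gradient of the mean squared error with respect to w on this batch.
      let grad = 0;
      for (let j = 0; j < bx.length; j++) {
        grad += -2 * bx[j] * (by[j] - w * bx[j]);
      }
      grad /= bx.length;
      w -= learningRate * grad; // one update step
    }
  }
  return w;
}

const w = train({ epochs: 200, batchSize: 2, learningRate: 0.05 });
console.log(w.toFixed(3)); // 2.000
```

Raising the learning rate too far makes the updates overshoot and diverge, while too few epochs leave w short of 2; experimenting with these values mirrors what the platform’s performance graphs show.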
Once their models were trained, students exported them and, following teacher guidance and the provided support materials, created a webpage to run their models. Additional software enabled the integration of the robot with the web-based model. With assistance from in-code comments and real-time teacher support—delivered both in-person and via projector—students configured the robotic arm to perform specific actions. After configuration, the code was uploaded to the robot and the students activated their full application.
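The glue between the trained model and the robot is conceptually simple: each prediction is a list of class names with confidence scores, the script picks the most confident class above a threshold, and the matching command is sent to the robot over the serial link. The decision step can be sketched as a pure function; the class names, command strings, and threshold below are illustrative assumptions, not the exact values used in the lesson scripts on GitHub, which also handle the camera, the model, and the serial port:

```javascript
// Map a Teachable Machine-style prediction (array of {className, probability})
// to a robot command string, or null when the model is too uncertain.
// Class names, commands and threshold are hypothetical examples.
function commandForPrediction(predictions, threshold = 0.8) {
  // Find the most confident class.
  const top = predictions.reduce((a, b) =>
    b.probability > a.probability ? b : a
  );
  if (top.probability < threshold) return null; // too uncertain: do nothing
  const commands = {
    head_left: "ARM_LEFT",   // hypothetical serial commands
    head_right: "ARM_RIGHT",
  };
  return commands[top.className] ?? null;
}

// Example prediction of the kind a trained pose model produces:
const prediction = [
  { className: "head_left", probability: 0.93 },
  { className: "head_right", probability: 0.07 },
];
console.log(commandForPrediction(prediction)); // "ARM_LEFT"
```

Keeping this mapping in one small function is also what made real-time teacher support feasible: students only had to edit the command table to change which pose triggered which arm movement.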
Later in the session, students were introduced to the second robot. They received a written guide and digital script explaining the functionality of each code component. Although no direct code modifications were required, students were encouraged to experiment by restricting the range of joint angles, which improved training efficiency.
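The SelfLearn robot’s behavior can be sketched as simple trial-and-error search: propose a pair of joint angles, execute the motion, measure how far the robot moved, and keep the best pair found so far. The sketch below is our simplification, with a simulated reward in place of the real distance sensor and illustrative constants throughout; it also shows why restricting the joint-angle range helps, since a smaller search space wastes fewer trials:

```javascript
// Simplified trial-and-error learning in the spirit of the SelfLearn robot:
// sample joint-angle pairs, "execute" them, keep the best-scoring pair.
// simulatedDistance() SIMULATES the distance travelled; the real robot
// measures it with a sensor. All constants here are illustrative assumptions.

// Small deterministic pseudo-random generator so runs are reproducible.
function makeRng(seed) {
  let s = seed;
  return () => {
    s = (s * 1664525 + 1013904223) % 4294967296;
    return s / 4294967296;
  };
}

// Pretend physics: some angle pairs push the robot further than others.
function simulatedDistance(a1, a2) {
  return Math.sin((a1 * Math.PI) / 180) * Math.cos((a2 * Math.PI) / 180);
}

function learn({ trials, minAngle, maxAngle, seed = 1 }) {
  const rand = makeRng(seed);
  const sample = () => minAngle + rand() * (maxAngle - minAngle);
  let best = { a1: sample(), a2: sample(), reward: -Infinity };
  best.reward = simulatedDistance(best.a1, best.a2);
  for (let t = 0; t < trials; t++) {
    const a1 = sample();
    const a2 = sample();
    const reward = simulatedDistance(a1, a2); // "how far did we move?"
    if (reward > best.reward) best = { a1, a2, reward }; // keep the best
  }
  return best;
}

// Restricting the angle range (as the students did) shrinks the search space.
const best = learn({ trials: 200, minAngle: 60, maxAngle: 120 });
console.log(best);
```

Narrowing `minAngle`/`maxAngle` around plausible values means a larger fraction of trials score well, which is the intuition behind the students’ observation that restricting the joint range improved training efficiency.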
Throughout the laboratory activity, the teacher engaged with each student group to gather impressions and stimulate discussion about the real-world applications of machine learning. These conversations reinforced the connection between technical concepts and societal use cases. At the conclusion of the session, students completed a short questionnaire consisting of five Likert-scale questions, designed to assess their interest in the lesson content and their self-perceived understanding of key AI and ML concepts.

4. Results

Throughout the robot demonstrations and laboratory activities, students consistently displayed a high level of interest and engagement. Compared to a similar implementation conducted in a previous academic year—where AI and ML concepts were introduced through lectures and a Scratch-based lab activity without the use of robots—the difference in student behavior was substantial. In the earlier implementation, 36 students (21 boys and 15 girls) completed a post-lesson questionnaire. When asked whether the lectures were interesting (on a 5-point Likert scale), 2 students answered “a little”, 6 “moderately”, 12 “quite”, and 16 “very much”, with a resulting mean score of 4.17 (SD = 0.91). Similarly, regarding their overall opinion of the topic, 1 student found it “a little interesting”, 3 “moderately”, 12 “interesting”, and 20 “very interesting”, yielding a mean score of 4.42 (SD = 0.77). Teacher observations during that session recorded noticeable disengagement: one in five students lost focus during the lectures, and in half of the lab groups (6 out of 12), at least one member disengaged from the activity, necessitating teacher intervention.
In contrast, during the current lesson, in which the two physical robots supported three distinct activities, no student was observed losing interest at any point during either the lecture or the lab session. Some students seated at the back even expressed frustration over limited visibility, eager not to miss any aspect of the presentation. All students remained focused, including those typically disengaged. Enthusiasm was particularly evident during the hands-on portion, where students witnessed their code executed by the robots—eliciting applause in some cases. When teachers visited each group to gather impressions, student responses were unanimously positive. Notably, 12 out of 20 students requested to continue working with the robot during the break, indicating an exceptional level of motivation not observed in the previous implementation.
A post-lesson questionnaire administered to the 20 students involved in the current intervention further confirmed these observations. The mean score for interest in the lecture was 4.85 (SD = 0.36), and for interest in the topic, 4.20 (SD = 0.77), both notably high. Regarding content comprehension, the mean scores were also encouraging: 4.10 (SD = 0.85) for understanding the distinction between preprogrammed motions and machine learning, 3.95 (SD = 0.75) for comprehension of the Teachable Machine platform, and 4.00 (SD = 0.80) for understanding the basics of reinforcement learning. These findings indicate strong engagement paired with meaningful conceptual understanding.
This comparison effectively serves as an informal control condition, as both implementations addressed similar learning objectives with comparable student demographics, differing only in the inclusion of educational robots. The observed increase in engagement and motivation, supported by both qualitative observation and quantitative student feedback, reinforces the added pedagogical value of the robotics-based methodology.
The robot presentations and lab activities were completed within the allocated time; however, one teaching hour proved to be barely sufficient, leaving limited opportunity for student questions and interaction with the instructor. Nonetheless, student inquiries were recorded for future lessons. Many of these questions were ethical in nature, including whether humans could be replaced by intelligent machines, whether AI development should continue, and how much autonomy should be granted to intelligent systems. Two students asked, “If we eventually succeed in making robots capable of replacing us in all jobs, what will humans do?” Such questions underscore the importance of including ethical considerations in AI education, as emphasized by several researchers [40,41,42,43,44].
During the lab session, technical questions—such as the clarification of code elements or locating functions—were addressed promptly. Teachers also visited each group to facilitate discussions on real-world applications of machine learning and to collect feedback on the lesson. Students proposed a variety of creative applications, including:
  • A Robot Trainer or Coach: Assisting users with exercise routines or sports training.
  • A Content Creator Robot: Supporting users in creating digital content, such as social media videos.
  • A Cooking Robot: Suggesting and preparing recipes based on available ingredients and user preferences.
  • A Study Assistant Robot: Helping students with homework and subject explanations.
  • An AI Fashion Assistant: Recommending outfit combinations and trends.
  • A Warehouse Robot: Managing inventory and coordinating stock replenishment.
It is worth noting that, in contrast to the earlier implementation without robots, student responses regarding potential machine learning applications were more focused and contextually appropriate. Specifically, in the previous implementation, 9 out of 36 responses to the open-ended question on ML applications were either unrelated or incorrect. In the current robotics-enhanced lesson, all 20 students provided relevant and thoughtful responses, reflecting a deeper and more accurate understanding of machine learning’s practical utility.
The robots functioned reliably throughout the sessions, with no unexpected malfunctions. Instructional materials—including scripts and explanatory notes—were clear and sufficient for supporting student tasks. When students encountered difficulties, teacher assistance was readily available, ensuring smooth and uninterrupted lesson progress.
The sequencing of the robots helped students distinguish between non-learning automation and actual machine learning processes. The first robot, which recorded and replayed joystick-defined positions, served to illustrate basic automation. In contrast, the Teachable Machine activity demonstrated supervised learning via human-generated data, and the third application introduced reinforcement learning through trial-and-error behavior. Together, these stages clarified the nature of machine learning and helped the students associate theoretical concepts with tangible real-world implementations.

5. Discussion

Although the lesson was not intended to provide an in-depth analysis of artificial intelligence, the inclusion of robots transformed students’ attitudes. Their integration into the educational process significantly enhanced attention and focus, as live demonstrations and the interactive nature of the activities maintained high levels of engagement. Moreover, the physical presence of the robots generated substantial interest, with students showing heightened attentiveness during the first presentation and increased willingness to participate during the second laboratory session, fostering a collaborative and enthusiastic learning environment.
The complexity introduced by the robots did not exceed the level required to meet the learning objectives. Their integration into the lesson did not significantly increase the time needed for preparation and execution compared to previous implementations that did not include robotics. In fact, their use facilitated a clearer and more practical understanding of AI and ML concepts while maintaining a manageable workload for both teachers and students.
Demonstrating robots functioning under realistic classroom conditions provided students with tangible and relatable examples of how modern technologies operate in practice. Students expressed appreciation for the opportunity to witness AI and ML technologies firsthand. Discussions held during the second lab session confirmed that students had not only grasped the basic mechanics of machine learning but also connected these principles to real-world applications, including process optimization, automation, and innovation in everyday life and industrial contexts.
The lesson structure was accessible to both instructors and students with no prior expertise in robotics or artificial intelligence. This approach allowed students from diverse academic backgrounds to engage with the subject matter. Those with an existing interest in technical fields showed increased enthusiasm for exploring robotics and advanced technologies—an engagement aligned with STEM fields traditionally seen as challenging.
The hardware performed reliably, without unexpected technical issues, confirming the robustness of the design. Activities were completed within the available class time; however, the limited duration of each session constrained opportunities for deeper discussion and extended exploration. It became evident that additional instructional time would be beneficial to further support student questions and broaden conceptual understanding.
Although the sample size was relatively small and learning outcomes were not evaluated through a formal pre/post design, the triangulation of data—combining teacher observations, student questionnaire responses, and classroom engagement—offers compelling qualitative and quantitative insights. The main constraint was the limited time allocated to AI education in the current ninth-grade curriculum, which restricts the potential for more extended interventions. The findings suggest the need for increased instructional hours and curriculum reform to support AI and ML integration.
The comparison with a previous non-robotic implementation provided further validation of the method. That earlier version, taught with lectures and a Scratch-based lab, resulted in lower engagement, as evidenced by questionnaire data and classroom observations. In contrast, the current robotics-enhanced approach sustained attention, encouraged active questioning, and fostered voluntary participation—outcomes that strongly suggest the added pedagogical value of robotics in AI and ML education.
The overall student impression was highly positive, with many expressing a desire for additional lessons on similar topics. The positive reception prompted the development of an expanded lesson plan, including two hours of lectures and two hours of laboratory work. The activity was scaled up to include additional classes within the same school and shared with three other teachers. For the upcoming academic year, the methodology will be incorporated into teacher training seminars. Additionally, curriculum changes at the national level now support more extensive AI instruction through increased classroom hours and updated teaching materials. These developments enable broader and more sustainable implementation. Future iterations will include structured follow-up assessments to evaluate long-term learning and impact.

6. Conclusions

The use of robots has been demonstrated to support the understanding of complex concepts in STEM courses [44]. In the experimental lesson presented in this study, the incorporation of educational robots enhanced the learning experience without introducing obstacles or increasing the educator’s workload. Their presence captured students’ attention and provided concrete examples that motivated more active engagement compared to a previous non-robotic implementation. As a result, students were able to distinguish between simple memorization—where a robot stores and repeats joint angles—and machine learning, which involves learning from examples and data.
Student engagement and participation during the laboratory phase were particularly strong, indicating that the use of robots contributed positively to both motivation and conceptual understanding. The integration of the robots into the lesson did not pose challenges related to time management or technical complexity. On the contrary, their use facilitated the explanation of advanced topics in a manner accessible to students and teachers alike.
Moreover, the cost of the robots is minimal, and all related hardware and software components are open-source. The robot designs and the necessary instructional materials are freely available on GitHub [33], ensuring that the approach can be replicated in other educational contexts with limited resources. The only potential barrier is the 3D printing of plastic components, which may require access to suitable equipment.
Future plans include the development of more extensive lesson sequences and the systematic evaluation of learning outcomes to assess the effectiveness of the intervention. Additional instructional materials and lesson plans will be shared with other educators, accompanied by online training seminars. Expanding the duration of activities and including a wider range of practical examples will further enhance the potential of this methodology as a scalable and impactful tool for teaching AI and ML in secondary education.

Author Contributions

Conceptualization, S.V., G.K. and J.K.; methodology, S.V. and G.K.; investigation, G.K. and J.K.; resources, J.K.; writing—original draft preparation, G.K.; writing—review and editing, S.V. and J.K.; supervision, S.V.; project administration, J.K. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and approved by the Ethics Committee of International Hellenic University, AΡ.6/01-07-2024, Monday 1 July 2024.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The raw data supporting the conclusions of this article will be made available by the authors upon request.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Luckin, R.; Holmes, W.; Griffiths, M.; Forcier, L.B. Intelligence Unleashed: An Argument for AI in Education; Pearson Education: London, UK, 2016. [Google Scholar]
  2. Ng, D.T.K.; Leung, J.K.L.; Chu, S.K.W. Conceptualizing AI literacy: An exploratory review. Comput. Educ. Artif. Intell. 2021, 2, 100011. [Google Scholar] [CrossRef]
  3. Brynjolfsson, E.; McAfee, A. The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies; W. W. Norton & Company: New York, NY, USA, 2014. [Google Scholar]
  4. Sutton, R.S.; Barto, A.G. Reinforcement Learning: An Introduction, 2nd ed.; MIT Press: Cambridge, MA, USA, 2018. [Google Scholar]
  5. Dautenhahn, K. Socially intelligent robots: Dimensions of human–robot interaction. Philos. Trans. R. Soc. B Biol. Sci. 2007, 362, 679–704. [Google Scholar] [CrossRef] [PubMed]
  6. Papert, S. Mindstorms: Children, Computers, and Powerful Ideas; Basic Books: New York, NY, USA, 1980. [Google Scholar]
  7. Chang, C.-Y.; Chen, C.-H.; Huang, Y.-M. Robots in situated learning classrooms with immediate feedback mechanisms to improve students’ learning performance. Comput. Educ. 2022, 186, 104534. [Google Scholar] [CrossRef]
  8. Mubin, O.; Stevens, C.J.; Shahid, S.; Al Mahmud, A.; Dong, J.-J. A Review of the Applicability of Robots in Education. Technol. Educ. Learn. 2013, 6, 1–10. [Google Scholar] [CrossRef]
  9. Anwar, S.; Perwez, S.K. Robotics and artificial intelligence in schools: Applications and benefits. Educ. Inf. Technol. 2020, 25, 981–1000. [Google Scholar]
  10. Sidiropoulos, D. Proceedings of the 6th Panhellenic Conference of the Museum of School Life and Education of the National Center for School Material Research and Preservation (EKEDISY); Educational Society of Greece: Athens, Greece, 2021; Volume 5, pp. 71–78. [Google Scholar]
  11. Prensky, M. Digital natives, digital immigrants. On the Horizon 2001, 9, 1–6. [Google Scholar]
  12. Guerreiro-Santalla, S.; Bellas, F.; Mallo, A. Introducing High School Students in Natural Interaction Through the Robobo Educational Robot. Lect. Notes Netw. Syst. 2022, 589, 500–512. [Google Scholar] [CrossRef]
  13. Schina, D.; Esteve-González, V.; Usart, M. An overview of teacher training programs in educational robotics: Characteristics, best practices and recommendations. Educ. Inf. Technol. 2021, 26, 2831–2852. [Google Scholar] [CrossRef]
  14. Dietz, G.; Chen, J.K.; Beason, J.; Tarrow, M.; Hilliard, A.; Shapiro, R.B. ARtonomous: Introducing middle school students to reinforcement learning through virtual robotics. In Proceedings of the 21st Annual ACM Interaction Design and Children Conference, New York, NY, USA, 27–30 June 2022; pp. 430–441. [Google Scholar]
  15. Mercan, G.; Selçuk, Z.V. Teaching Machine Learning Through Educational Robotics. In Effective Computer Science Education in K-12 Classrooms; Kert, S., Ed.; IGI Global Scientific Publishing: Hershey, PA, USA, 2025; pp. 293–318. [Google Scholar] [CrossRef]
  16. Cuevas, E.; Zaldivar, D.; Pérez-Cisneros, M.; Ramirez-Ortegon, M. Hands-on experiments on intelligent behaviour for mobile robots. Int. J. Electr. Eng. Educ. 2011, 48, 66–78. [Google Scholar] [CrossRef]
  17. Eguchi, A. AI-Powered Educational Robotics as a Learning Tool to Promote Artificial Intelligence and Computer Science Education. In Robotics in Education. RiE 2021; Merdan, M., Lepuschitz, W., Koppensteiner, G., Balogh, R., Obdržálek, D., Eds.; Advances in Intelligent Systems and Computing; Springer: Cham, Switzerland, 2022; Volume 1359. [Google Scholar] [CrossRef]
  18. Niedźwiecki, A.; Jongebloed, S.; Zhan, Y.; Kümpel, M.; Syrbe, J.; Beetz, M. Cloud-Based Digital Twin for Cognitive Robotics. In Proceedings of the 2024 IEEE Global Engineering Education Conference (EDUCON), Kos Island, Greece, 8–11 May 2024; IEEE: New York, NY, USA, 2024; pp. 1–5. [Google Scholar]
  19. Kintonova, A.; Suleimenova, B.; Shangytbayeva, A.K. Artificial intelligence in education. Yessenov Sci. J. 2024, 48, 303–309. [Google Scholar] [CrossRef]
  20. Georgiev, G.; Hristov, G.; Zahariev, P.; Kinaneva, D. Robotics in Education: A Comparative analysis of robotic platforms across educational levels. In Proceedings of the 2023 31st National Conference with International Participation (TELECOM), Sofia, Bulgaria, 16–17 November 2023; pp. 1–4. [Google Scholar] [CrossRef]
  21. Wang, H.; Luo, N.; Zhou, T.; Yang, S. Physical Robots in Education: A Systematic Review Based on the Technological Pedagogical Content Knowledge Framework. Sustainability 2024, 16, 4987. [Google Scholar] [CrossRef]
  22. Ouyang, F.; Xu, W. The effects of educational robotics in STEM education: A multilevel meta-analysis. Int. J. STEM Educ. 2024, 11, 7. [Google Scholar] [CrossRef]
  23. Tselegkaridis, S.; Sapounidis, T. Simulators in Educational Robotics: A Review. Educ. Sci. 2021, 11, 11. [Google Scholar] [CrossRef]
  24. Breazeal, C. Toward sociable robots. Robot. Auton. Syst. 2003, 42, 167–175. [Google Scholar] [CrossRef]
  25. Pu, I.; Nguyen, G.; Alsultan, L.; Picard, R.; Breazeal, C.; Alghowinem, S. A HeARTfelt Robot: Social Robot-Driven Deep Emotional Art Reflection with Children. arXiv 2024, arXiv:2409.10710. [Google Scholar] [CrossRef]
  26. Belpaeme, T.; Kennedy, J.; Ramachandran, A.; Scassellati, B.; Tanaka, F. Social robots for education: A review. Sci. Robot. 2018, 3, eaat5954. [Google Scholar] [CrossRef]
  27. Tanaka, F.; Cicourel, A.; Movellan, J.R. Socialization between toddlers and robots at an early childhood education center. Proc. Natl. Acad. Sci. USA 2007, 104, 17954–17958. [Google Scholar] [CrossRef]
  28. Machine Learning for Kids. 2017. Available online: https://machinelearningforkids.co.uk (accessed on 3 June 2024).
  29. Scratch. Available online: https://scratch.mit.edu/projects/editor/?tutorial=getStarted (accessed on 3 January 2025).
  30. Karalekas, G.; Vologiannidis, S.; Kalomiros, J. EUROPA—A ROS-based Open Platform for Educational Robotics. In Proceedings of the 2019 10th IEEE International Conference on Intelligent Data Acquisition and Advanced Computing Systems: Technology and Applications (IDAACS), Metz, France, 18–21 September 2019; pp. 452–457. [Google Scholar] [CrossRef]
  31. Karalekas, G.; Vologiannidis, S.; Kalomiros, J. EUROPA: A Case Study for Teaching Sensors, Data Acquisition and Robotics via a ROS-Based Educational Robot. Sensors 2020, 20, 2469. [Google Scholar] [CrossRef]
  32. Karalekas, G.; Vologiannidis, S.; Kalomiros, J. Teaching Machine Learning in K–12 Using Robotics. Educ. Sci. 2023, 13, 67. [Google Scholar] [CrossRef]
  33. AI and ML Teaching with Robots Repository. Available online: https://github.com/kgjeep/AI_And_ML_Teaching_With_Robots (accessed on 12 December 2024).
  34. Colgate, J.E.; Wannasuphoprasit, W.; Peshkin, M.A. Cobots: Robots for collaboration with human operators. In Proceedings of the 1996 ASME International Mechanical Engineering Congress and Exposition, Atlanta, GA, USA, 17–22 November 1996; American Society of Mechanical Engineers: New York, NY, USA, 1996; Volume 58, pp. 433–439. [Google Scholar]
  35. Google Teachable Machine App. Available online: https://teachablemachine.withgoogle.com/ (accessed on 5 December 2024).
  36. No Code Robot Video. Available online: https://industrialrobotics.lt/no-code-robotics/ (accessed on 5 December 2024).
  37. Collaborative Robot Video. Available online: https://www.youtube.com/watch?v=TKyEbTcFW5M (accessed on 5 December 2024).
  38. Two-Month-Old Baby Video. Available online: https://www.youtube.com/watch?v=R7kQMd8M3U4 (accessed on 5 December 2024).
  39. Frewin, K.L.; McEwen, E.; Gerson, S.A.; Bekkering, H.; Hunnius, S. What Is Going on in Babies’ Brains When They Learn to Do Something? Front. Young Minds 2019, 7, 44. [Google Scholar] [CrossRef]
  40. Ko, J.; Song, A. Youth perceptions of AI ethics: A Q methodology approach. Ethics Behav. 2024, 2024, 2396582. [Google Scholar] [CrossRef]
  41. Bellaby, R.W. The ethical problems of ‘intelligence–AI’. Int. Aff. 2024, 100, 2525–2542. [Google Scholar] [CrossRef]
  42. Burton, E.; Goldsmith, J.; Koenig, S.; Kuipers, B.; Mattei, N.; Walsh, T. Ethical Considerations in Artificial Intelligence Courses. AI Mag. 2017, 38, 22–34. [Google Scholar] [CrossRef]
  43. Vu, T. Combating the Machine Ethics Crisis: An Educational Approach. arXiv 2020, arXiv:2004.00817. [Google Scholar]
  44. Čelarević, A.; Mrakić, I. Application of educational robots in teaching. Nauka I Tehnol. 2023, 11, 8–16. [Google Scholar] [CrossRef]
Figure 1. (a) Arduino Mini Manipulator. (b) Technical CAD View of the Mini Robot Manipulator. The manipulator consists of 3D-printed components and servo-driven joints, with overall dimensions of approximately 75 mm width, 71 mm depth, and 233 mm height.
Figure 2. Arduino Mini Manipulator flow chart.
Figure 3. Teachable Machine web application.
Figure 4. Training of the model.
Figure 5. Training performance of the model.
Figure 6. Testing the model.
Figure 7. The student connects to the web application through a browser, where they train the machine learning model. The application returns an HTML file containing the trained model’s URL. The connection between the browser app and the robot is made with JavaScript running on a local computer.
Figure 8. SelfLearn robot side view.
Figure 9. SelfLearn Robot CAD Design.
Figure 10. SelfLearn logic diagram.
Share and Cite

Karalekas, G.; Vologiannidis, S.; Kalomiros, J. Teaching Artificial Intelligence and Machine Learning in Secondary Education: A Robotics-Based Approach. Appl. Sci. 2025, 15, 4570. https://doi.org/10.3390/app15084570