Article

Design and Implementation of a Tripod Robot Control System Using Hand Kinematics and a Sensory Glove

Jakub Krzus, Tomasz Trawiński, Paweł Kielan and Marcin Szczygieł
Department of Mechatronics, Silesian University of Technology, 44-100 Gliwice, Poland
*
Author to whom correspondence should be addressed.
Electronics 2025, 14(6), 1150; https://doi.org/10.3390/electronics14061150
Submission received: 22 January 2025 / Revised: 12 March 2025 / Accepted: 12 March 2025 / Published: 14 March 2025
(This article belongs to the Topic Electronic Communications, IOT and Big Data, 2nd Volume)

Abstract
Current technological progress in automation and robotics allows human kinematics to be used to control a wide range of devices. As part of this study, a sensory glove was developed that allows a delta robot to be controlled using hand movements. The process of controlling an actuator can often be problematic due to its complexity. The proposed system addresses this problem through human–machine interaction. The sensory glove allows for easy control of the robot by detecting the rotation of the hand and the pressing of control buttons. Conventional buttons have been replaced with smart materials such as conductive thread and conductive fabric. The ESP32 microcontroller placed on the control glove collects data read from the MPU6050 sensor. It also provides wireless communication with a Raspberry Pi board supporting the Modbus TCP/IP protocol, which controls the robot’s movement. Because the data read from the gyroscope are noisy, the signals were filtered using both basic recursive filters and a more advanced Kalman filter algorithm.

1. Introduction

Technological progress is particularly noticeable in the field of industrial automation and robotics [1]. Modern factories increasingly utilize robots that are capable of performing specific tasks assigned by operators [2]. The introduction of such machines not only increases work efficiency but also improves worker safety by reducing their exposure to hazardous conditions [3]. Robotics research opens up new possibilities for precise task execution, especially where repeatability and error elimination are critical. However, integrating robots into industrial processes presents challenges, including difficulties in implementing control systems and ensuring effective collaboration with human operators [2,4]. Contemporary control systems should primarily be intuitive for users. Intuitiveness means that an operator can easily understand and utilize all system functions without extensive training. Therefore, more and more control systems are being designed to utilize signals from the human body, such as eye or facial movement tracking [5]. These technologies have the potential to revolutionize human–machine interactions, making this collaboration more natural and efficient. For instance, eye-tracking can be used to control a machine without using hands, which is particularly useful in environments requiring multitasking.
Another innovative approach involves utilizing human limb movements [6]. Hand gestures, described through appropriate mathematical transformations and supported by advanced sensors, offer an intuitive means of controlling machines, simplifying the control process and minimizing errors [7]. The development of sensory technology, driven by the need for the precise reading of signals from the human body, has enabled the creation of modern sensors that are capable of rapid data processing [8]. An example of this progress is hand motion control, which can be achieved using image processing technology [9] or intelligent sensory gloves [10]. Such approaches are particularly important in applications requiring precision and safety, such as work in difficult or hazardous environments [9,11]. For example, in nuclear or chemical industries, gestures can replace traditional controllers, enabling safer operations in high-risk areas. The relationship between the development of robotics, automation, and control theory is a frequent topic of scientific research. A proper control system requires a comprehensive measurement system capable of tracking the Cartesian coordinates and angular positions of individual robot components [12]. Mathematical models of robot dynamics present solutions to issues related to stability and disturbances, while the development of neural networks has a significant impact on the compensation of these disturbances [13]. Furthermore, the dynamic advancement of control theory results from increasingly sophisticated systems and the need to address new problems [14].
In the context of consumer and domestic robots, their ability to collaborate effectively with humans will be a crucial element [15]. Such systems must not only acquire and process information in real time but also ensure reliable and fast data transmission without losses [16]. A control system should be open and flexible to allow for easy adaptation and integration with other technologies. Utilizing a dual-processor configuration enhances these qualities by providing a hierarchical and modular architecture, where each layer has a clearly defined purpose. This approach improves the system’s efficiency and organization [4]. At the same time, controlling executive devices using traditional systems requires appropriate qualifications and skills [17]. Meanwhile, research on using human gestures, particularly hand and arm movements, offers new possibilities for creating intuitive control systems based on the analogy of machine movements to upper-limb movements [18].
Robotics has advanced considerably, yet several aspects remain under development. In gesture-based control, the main difficulty lies in correctly reading the operator’s specific movements, such as the grip on various elements, hand dynamics, and the force and pressure applied during contact. Research shows how important it is to design and analyze the kinematics, together with appropriate testing of the kinematic model. A verified model allows for the derivation of the formulas describing both the kinematics and dynamics of the analyzed object [19].
This study presents the development of an intuitive control system for a delta robot based on a sensory glove. The proposed solution integrates advanced motion capture and wireless communication technologies to enable precise and natural robot manipulation. The main contribution of this research lies in the implementation of a real-time control system that enhances user experience and reduces the complexity of traditional robot operations.
Section 2 presents a classification of the types of robots used in industry and justifies the choice of a delta robot as the controlled object. Section 3 deals with the control glove and the specifications of the control system, describing the construction of the control glove and the hardware layer of the project. Section 4 addresses signal filtering, describing the methods used to filter the signal from the sensor module; the methods are compared and the most suitable filtration method is selected. Section 5 deals with control system implementation and demonstration, showing a real delta robot with the applied control system. Section 6 offers the conclusions from the project implementation process and the subsequent stages of project development.

2. Classification of Industrial Robots and Control Systems

The following classification of industrial robots is based on their movement capabilities and structural design [20]. This classification (Figure 1) provides essential context for understanding the various types of industrial robots and how their structural characteristics influence their functionality and control. By differentiating between robot types such as Cartesian, cylindrical, and delta robots, it becomes easier to assess their respective advantages and limitations. This understanding aids in selecting appropriate control strategies and kinematic approaches, ensuring optimal performance for specific tasks. Ultimately, this classification serves as a foundation for analyzing how a robot’s structure affects its movement precision, efficiency, and suitability for particular industrial applications.
Frequently used robots have a serial kinematic structure, resembling a human arm. Universal Robots arms, such as the UR3e model, are one example of this type; they are lightweight, compact, and designed for tasks requiring precision. These solutions are used in many industries where their flexibility and ease of programming are key. However, such industrial robots are not suitable for all types of technological processes. Some of these processes take place very quickly and require the appropriate stiffness of the mechanism under varying load forces [21]. In these cases, delta robots are often used; they are primarily employed in the food, pharmaceutical, and electronics industries. Delta robots, such as the ABB FlexPicker, are commonly used for sorting, packaging, and assembling small components. Their design also allows them to work in environments with high hygienic requirements, such as on food processing production lines. It is also worth mentioning SCARA robots (Selective Compliance Assembly Robot Arm), such as the Yamaha YK400, which are used for tasks requiring fast and precise movements in the horizontal plane. SCARA robots perform well in electronic component assembly and processes requiring high-level repeatability. Another interesting example is Cartesian robots, such as the Bosch Cartesian Robots line, which offer simple construction and high accuracy. They are used in applications such as machining, gluing, or 3D printing, where stability and ease of programming motion in three axes are important. Despite their advantages, delta robots have certain limitations, such as a relatively small working range and a limited ability to handle heavier loads. However, they are irreplaceable in applications where precision and speed are crucial. Thanks to these features, delta robots are often chosen where performance matters, such as in the automation of highly repetitive processes. A comparison of the presented robots is given in Table 1.
The classification of industrial robots, particularly Cartesian, cylindrical, and delta robots, provides a foundational context for understanding the application of the proposed control system. While anthropomorphic robots, which resemble the human upper limb, are often considered the most intuitive in terms of design, the proposed control system is not limited to these types. The flexibility of its signal processing and control methods allows for adaptation across various robot types, with a particular focus on delta robots in this study. Delta robots are especially well-suited for tasks requiring high speed and precision, making them ideal candidates for demonstrating the effectiveness of the proposed control system. Their unique characteristics, such as exceptional performance in repetitive motion applications, create an optimal environment for evaluating and implementing the intuitive control system using a sensory glove. This relationship underscores the broad applicability of the proposed system and highlights its potential to enhance the efficiency and interactions of various robot types beyond traditional anthropomorphic designs.
The delta robot comprises a base, arms, and a triangular platform—presented in Figure 2. The base serves as the primary structural component, providing stability and support for the entire mechanism. It houses the drive units responsible for controlling the movement of the robot’s arms. Constructed from durable materials such as steel or aluminum, the base ensures structural rigidity while minimizing vibrations during operation.
The robot’s arms, consisting of three two-section components, play a critical role in its functionality. Each arm features two segments connected by joints, enabling smooth and precise motion transfer from the drive units to the work platform. The use of lightweight materials, such as carbon fiber or light metal alloys, enhances the robot’s speed and precision. This optimized design allows for high acceleration and short cycle times, which are essential for efficient operations.
The delta robot operates based on parallel kinematics, where three independent actuators control the movement of each arm. By adjusting the angles of the arms, the end-effector is positioned with high accuracy. A standard Delta robot typically has three translational degrees of freedom (X, Y, Z) and, in some cases, an additional rotational DOF for end-effector orientation. This configuration allows the robot to move the end-effector precisely in three-dimensional space without changing its tilt. Thanks to its parallel structure, the delta robot achieves high-speed movement, low inertia, and excellent precision, making it ideal for tasks requiring rapid and precise manipulation. The driving system of the delta robot plays a crucial role in its operation, ensuring the accurate and dynamic movement of the arms. Typically, the system consists of three high-speed servo motors or stepper motors, each controlling one arm. These motors are mounted at the base and drive the arms through a system of linkages and joints.
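To make the inverse kinematics concrete, the sketch below shows a widely used textbook formulation for a rotary delta robot (solving each arm independently by intersecting the upper-arm circle with the sphere reachable by the forearm). It is an illustration of the general principle, not the implementation used in this study, and the geometry values e, f, re, and rf (effector triangle side, base triangle side, forearm length, upper-arm length) are placeholders to be replaced with the real robot’s dimensions.

```python
import math

def _calc_angle_yz(x0, y0, z0, e, f, re, rf):
    """Drive angle (deg) for the arm lying in the YZ plane; z0 is assumed negative
    (end-effector below the base). Returns None if the point is unreachable."""
    y1 = -0.5 * math.tan(math.radians(30)) * f   # base joint offset (f/2 * tan 30)
    y0 -= 0.5 * math.tan(math.radians(30)) * e   # shift target to the effector joint
    a = (x0 * x0 + y0 * y0 + z0 * z0 + rf * rf - re * re - y1 * y1) / (2.0 * z0)
    b = (y1 - y0) / z0                           # elbow lies on the line z = a + b*y
    d = -(a + b * y1) ** 2 + rf * (b * b * rf + rf)
    if d < 0:
        return None                              # target outside the workspace
    yj = (y1 - a * b - math.sqrt(d)) / (b * b + 1.0)   # outer intersection point
    zj = a + b * yj
    return math.degrees(math.atan(-zj / (y1 - yj))) + (180.0 if yj > y1 else 0.0)

def delta_inverse_kinematics(x, y, z, e=60.0, f=200.0, re=300.0, rf=150.0):
    """Return the three drive angles (deg) for an end-effector position (x, y, z)."""
    c120, s120 = math.cos(math.radians(120)), math.sin(math.radians(120))
    t1 = _calc_angle_yz(x, y, z, e, f, re, rf)
    t2 = _calc_angle_yz(x * c120 + y * s120, y * c120 - x * s120, z, e, f, re, rf)
    t3 = _calc_angle_yz(x * c120 - y * s120, y * c120 + x * s120, z, e, f, re, rf)
    return None if None in (t1, t2, t3) else (t1, t2, t3)

angles = delta_inverse_kinematics(0.0, 50.0, -250.0)   # example target in mm
```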
The motors receive signals from the control system, which processes input data and translates it into precise angular displacements. This allows the delta robot to achieve fast and smooth movements while maintaining high accuracy. Advanced motion control algorithms, such as PID controllers or model predictive control, further enhance the precision and responsiveness of the driving system, making it suitable for applications requiring high-speed and high-precision operations. The relationship between the driving system and the control system is essential for the efficient operation of the delta robot. The control system, which is typically implemented on a microcontroller or embedded platform, continuously processes input data from sensors, such as encoders, and computes the necessary motor commands. The accuracy of this control system directly affects the precision of the driving system, ensuring that the motors respond correctly to the required position and velocity commands. Moreover, real-time feedback loops allow the system to adjust movements dynamically, compensating for external disturbances and ensuring stable operations. The integration of advanced algorithms, such as inverse kinematics and trajectory planning, further optimizes the coordination between the driving system and control logic, enabling smooth and precise motion control even in complex tasks. Located below the arms, the triangular platform is the final element of the delta robot. This platform serves as the attachment point for grippers, work tools, or other devices, depending on the robot’s specific application.
Thanks to its articulated connection with the arms, the platform moves synchronously with them, ensuring precise and coordinated motion. The triangular shape of the platform ensures even force distribution, providing stability even during high-speed operations or dynamic changes in movement direction. These three elements—the base, arms, and triangular platform—work together to provide the delta robot with high precision, speed, and stability during operation.
There are many methods of controlling a delta robot. The classical methods refer to traditional approaches based on established control theories and techniques. One of the most commonly used classical methods is PID control, which adjusts the control signal based on the error, the integral of the error, and the rate of change of the error [23]. Alternative methods of controlling delta robots are becoming more widespread, utilizing modern technologies and computational power. Machine learning and artificial intelligence (AI) are used to optimize the robot’s control system, enabling it to adapt to various working conditions and improve its performance over time. One alternative method is a kinematics algorithm that yields only a single root, made possible through analytical justification and numerical calculations [24]. Another alternative uses ANFIS (Adaptive Neuro-Fuzzy Inference System) to solve the inverse kinematics problem. Research [24] shows that this is possible when a five-layer neural network is used to accurately predict the final position. ANFIS provides fast and acceptable solutions to the inverse kinematics problem of a delta robot, as confirmed during simulations using VR [25]. Traditional methods require providing specific rotation values for the individual drives in the case of forward kinematics, or providing a position in the Cartesian coordinate system for inverse kinematics. The final position of the delta robot platform depends on the construction of the robot, mainly on the length of the arms [26].
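As a minimal illustration of the classical PID law referred to above, the sketch below implements a discrete PID controller; the gains and sampling period are illustrative values, not parameters taken from the robot used in this study.

```python
class PID:
    """Discrete PID controller: u = Kp*e + Ki*integral(e) + Kd*de/dt."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt                   # accumulated error
        derivative = (error - self.prev_error) / self.dt   # rate of change of the error
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# One controller per drive, correcting the joint-angle error every 10 ms (illustrative).
pid_axis1 = PID(kp=2.0, ki=0.5, kd=0.05, dt=0.01)
command = pid_axis1.update(setpoint=30.0, measurement=28.4)   # degrees in, drive command out
```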
Therefore, an attempt was made to create a control system using hand movements in space. This system allows the operator to easily control the robot. Standard robot control methods often require the operator to know the appropriate programming languages for industrial robots. The created control system is easy to use for every operator, regardless of their level of technical knowledge.

3. Control Glove and Specifications of the Control System

To implement this approach, a control glove must be built that sends the appropriate signals to the robot. The data are then received by a microcontroller and forwarded to the robot over a suitable communication protocol. Finally, the quality of the filters used is analyzed.
The control system concept (Figure 3) consists of several modules: a glove with sensors and a microcontroller responsible for archiving, processing, and sending data; a power system for the glove (represented in Figure 3 by the block labeled “Powerbank”); a microcontroller that receives signals from the glove; and a tripod robot controller.
The glove’s task is to read data from the sensor module, archive, process, and filter them, then transmit the data via wireless communication. The glove also includes a power system placed on the operator’s arm to alleviate strain on the hand. The microcontroller receiving signals from the glove supervises both wireless and wired communication. It processes the received data into a format that aligns with the robot controller’s communication protocol and sends a data frame to the robot controller. The robot controller is responsible for receiving the control signals through the appropriate communication protocol, processing the data into corresponding control signals for each axis, and managing additional components.
In order to reduce losses and increase hand mobility, conductive thread and conductive fabric were used to make electrical connections. The glove concept is shown in Figure 4.
The microcontroller, together with the sensor module and the display, is placed on the upper part of the control glove.
In the first stage of component selection, it was necessary to select an appropriate microcontroller that would allow data to be read from the sensor module and sent via wireless communication. The microcontroller should meet the following requirements (presented in Figure 5):
After analyzing the available microcontrollers, the ESP32 module was chosen due to its superior performance and integrated communication features. Compared to other commonly used microcontrollers such as the ATmega328P or ESP8266, the ESP32 offers significantly greater computing power and memory. While it is physically larger than some alternatives, it has built-in Bluetooth and WiFi modules, eliminating the need for additional communication components. Additionally, its higher processing power allows for faster and more frequent data acquisition from the sensor module, improving measurement accuracy. Unlike the ESP8266, the ESP32 supports multithreading, enabling the simultaneous execution of multiple tasks, which further enhances its efficiency in complex applications.
Location and orientation can be determined using inertial systems, which include an accelerometer, gyroscope, and magnetometer. These three sensors, when combined, provide comprehensive data on the device’s movement and orientation in three-dimensional space. This is possible because each sensor measures different aspects of motion: the accelerometer detects linear acceleration, the gyroscope measures angular velocity, and the magnetometer detects the device’s orientation relative to Earth’s magnetic field. By integrating data from these sensors, advanced sensor fusion algorithms can offer more accurate and reliable information about the device’s position, speed, and orientation. One such algorithm is the Kalman filter, which is frequently used to combine accelerometer, gyroscope, and magnetometer data [28]. This method improves the precision of orientation and position measurements by correcting for sensor noise and errors, thereby delivering a more accurate estimation of the device’s state.
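As a lightweight illustration of such fusion, the sketch below blends gyroscope and accelerometer data for a single rotation axis using a complementary filter; it is a simplified stand-in for the Kalman-based fusion mentioned above, and the axis convention and blending coefficient are assumptions.

```python
import math

def fuse_pitch(pitch_prev, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """Complementary filter for one axis: gyro_rate in deg/s, accelerations in g,
    dt in seconds. alpha close to 1 trusts the gyroscope for short-term changes."""
    accel_pitch = math.degrees(math.atan2(accel_x, accel_z))  # noisy but drift-free
    gyro_pitch = pitch_prev + gyro_rate * dt                  # smooth but drifts
    return alpha * gyro_pitch + (1.0 - alpha) * accel_pitch

# Example update at 100 Hz with illustrative readings.
pitch = 0.0
pitch = fuse_pitch(pitch, gyro_rate=1.5, accel_x=0.02, accel_z=0.99, dt=0.01)
```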
A common task for robot control systems is real-time positioning. For this reason, robots are often equipped with inertial measurement units (IMUs) based on microelectromechanical system (MEMS) technology. These units combine an accelerometer and a gyroscope in one module, forming the basis for measuring linear acceleration and angular velocity [26]. The general availability and small size of the MPU6050/GY-521 module determined its selection [28,29]. The built-in DMP (digital motion processor) hardware unit fuses data from all sensors, which allows a specific orientation relative to the Earth to be determined. The undoubted advantage of this solution is that it significantly reduces the load on the microcontroller. The DMP unit can also be programmed to use an external magnetometer in its calculations. Using the DMP unit, the gyroscopic drift that appears during slowly varying motion can be eliminated [30]. The use of 16-bit converters significantly increases the measurement accuracy of the MPU6050 module. Thanks to the use of the I2C protocol, the system is easy to use and does not require additional peripheral devices.
The specifications of this module are presented in Table 2.
The small dimensions of the module allow it to be easily placed on the PCB.
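For illustration, a minimal MicroPython-style sketch of reading raw MPU6050 data over I2C on the ESP32 is shown below. The pin assignment, I2C bus number, and polling rate are assumptions rather than the values used on the glove’s PCB; the register addresses and scale factors correspond to the sensor’s default ±2 g and ±250°/s ranges.

```python
# MicroPython (ESP32): raw MPU6050 readout over I2C.
from machine import I2C, Pin
import struct
import time

MPU_ADDR = 0x68                                       # default address with AD0 pulled low
i2c = I2C(0, scl=Pin(22), sda=Pin(21), freq=400000)   # assumed wiring
i2c.writeto_mem(MPU_ADDR, 0x6B, b'\x00')              # PWR_MGMT_1: wake the sensor

def read_mpu6050():
    # 14 bytes from ACCEL_XOUT_H: accel XYZ, temperature, gyro XYZ (big-endian int16)
    raw = i2c.readfrom_mem(MPU_ADDR, 0x3B, 14)
    ax, ay, az, _temp, gx, gy, gz = struct.unpack('>hhhhhhh', raw)
    accel = tuple(v / 16384.0 for v in (ax, ay, az))   # LSB per g at +/-2 g
    gyro = tuple(v / 131.0 for v in (gx, gy, gz))      # LSB per deg/s at +/-250 deg/s
    return accel, gyro

while True:
    accel, gyro = read_mpu6050()
    print('accel [g]:', accel, 'gyro [deg/s]:', gyro)
    time.sleep_ms(20)                                  # roughly 50 Hz polling (illustrative)
```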
Another issue is the selection of an appropriate microcontroller that can receive signals from the control glove, process them, and send them to the robot controller. Given the interfaces of the robot controller, the microcontroller should primarily support Wi-Fi communication, be compatible with the Modbus TCP/IP protocol, and include an Ethernet port. The Modbus TCP/IP protocol is based on the Ethernet TCP/IP communication standard. This is a version of the Modbus protocol that uses TCP for communication, operating on port 502. Unlike the Modbus RTU version, it does not include checksum calculations because this function is already performed by the TCP layer. The unit ID field is not always used because the IP address used in the TCP/IP protocol already serves as the device identifier [32]. In a Modbus TCP network, it is necessary to distinguish between master and slave devices, and communication takes place using TCP/IP packets. The transaction ID field is used to match each response to the corresponding host query. The length (message size) field allows the transmitted data to be divided into TCP/IP packets, ensuring smooth transmission. The device identifier field makes it easier to address network terminals supporting the classic version of Modbus through appropriate gateways [33]. The data frame is shown in Figure 6.
The increasing use of the TCP/IP protocol across various industries has led more manufacturers of industrial automation systems to implement this standard in their devices. Typically, solutions based on this protocol integrate proven communication mechanisms used in industrial networks with the functionalities of the TCP/IP protocol. A frequently used practice is the encapsulation of industrial network frames in TCP segments, which allows for the effective use of the capabilities of this communication standard [34]. The Modbus TCP protocol uses the Ethernet TCP/IP standard, which enables easy and convenient connection between devices with a TCP/IP interface. This allows users to communicate effectively between different devices on the network, which increases the flexibility and interoperability of systems using this protocol [35]. Additionally, Modbus TCP eliminates the need for complex conversion processes between different communication standards, simplifying system architecture and reducing latency. Another advantage of this protocol is its scalability, allowing the seamless integration of new devices without significant reconfiguration efforts. These features make Modbus TCP a preferred choice in many industrial automation applications, where reliability and ease of use are crucial [36].
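To make the frame layout concrete, the sketch below builds a Modbus TCP “Write Multiple Registers” (function 0x10) request by hand, following the MBAP header structure described above; the IP address, unit ID, register addresses, and register values are placeholders rather than those used in the actual installation.

```python
import socket
import struct

def write_registers(sock, transaction_id, unit_id, start_addr, values):
    """Send a function 0x10 request and return the raw response bytes."""
    pdu = struct.pack('>BHHB', 0x10, start_addr, len(values), 2 * len(values))
    pdu += b''.join(struct.pack('>H', v & 0xFFFF) for v in values)
    # MBAP header: transaction ID, protocol ID (0), remaining length, unit ID
    mbap = struct.pack('>HHHB', transaction_id, 0x0000, len(pdu) + 1, unit_id)
    sock.sendall(mbap + pdu)
    return sock.recv(256)

# Placeholder address of the robot controller, Modbus TCP port 502.
sock = socket.create_connection(('192.168.1.50', 502))
reply = write_registers(sock, transaction_id=1, unit_id=1,
                        start_addr=0, values=[1200, 340, 90])   # e.g., scaled x, y, z
sock.close()
```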
When selecting microcontrollers, it is necessary to pay attention to the communication protocols they support. Comparing the platforms available on the market, such as the Arduino Uno, ESP32, and Raspberry Pi, the last of these was chosen due to its built-in support for the Modbus protocol without the need for additional libraries and the presence of an Ethernet port.
The Raspberry Pi 3B+ offers the basic communication interfaces found in everyday computers. In addition to audio and video connections, the user has at their disposal the features presented in Table 3.
The real-time performance of the control system is achieved through optimized hardware and software components. The ESP32 microcontroller enables efficient wireless communication with low latency, while the MPU6050 sensor provides real-time motion data, which is processed using filtering algorithms to enhance accuracy. The Raspberry Pi 3B+ functions as the central processing unit, managing data acquisition and control tasks with sufficient computational power to ensure smooth operation. This integrated hardware setup allows the system to respond quickly and reliably, making it well-suited for precise robotic control applications.
The components used, along with their functions, are presented in Table 4.
In the next step, a program was developed to enable control using the control glove. The block diagram of the control system is presented in Figure 7. The program consists of several key components, beginning with process initialization. During this stage, the inputs and outputs of the microcontroller on the control glove PCB are configured. Additionally, the Wi-Fi connection is established by assigning the IP address of the target device and setting a password to facilitate communication between the two microcontrollers.
Furthermore, the I2C protocol is initialized within the microprocessor system to enable data retrieval from the sensor module. At this stage, the MPU6050 module is also calibrated. To enhance system performance, the entire initialization process should be conducted with minimal hand movement, reducing potential interference.
Next, the program moves on to downloading data from the MPU6050. At this stage, the data are read from the sensor module. The electrical signals received from the sensor appear in the form of a data frame, and the ESP32 module only needs to receive these data and store them in a buffer. Following this, the program enters the data processing phase, where the data frames received from the sensor module are processed. To ensure proper processing, an appropriate library is used to convert the received signals into clear indications from the gyroscope and accelerometer. At this stage, the signal is also filtered. The next step involves checking the status of the microcontroller inputs. Here, it is verified whether any of the buttons (made of conductive fabric) on the fingers are activated. This information is then saved in the microcontroller.
Finally, if an enabling input is active, the data are post-processed, and the appropriate data frame is created to send the information to the intermediary microcontroller between the control glove and the delta robot. The relationship between the control system of the delta robot mechanism and the sensory glove is crucial for ensuring smooth and responsive operations. The control glove acts as the primary interface for human interaction, allowing the intuitive and precise manipulation of the robot. The sensory glove captures hand movements and button presses, which are processed and converted into control signals. These signals are then transmitted wirelessly to the intermediary microcontroller, which interprets them and generates appropriate commands for the delta robot’s driving system. This integration ensures that the user’s hand movements directly influence the positioning of the robot’s end-effector, enabling real-time and highly accurate control. By combining data from the glove’s motion sensors and tactile inputs, the system achieves a natural and efficient method of robot operation, enhancing usability and responsiveness.
Each button (Figure 4) has its own unique function (presented in Table 5) that allows the user to control the delta robot. Data from the buttons are sent to the robot controller and trigger the appropriate function.
The sensor module enables rotation measurement along three axes. In the full-control-system mode, data from all three axes are captured simultaneously, providing the user with comprehensive information about hand positioning in 3D space. This mode allows the robot’s TCP (tool center point) to move along multiple axes at once. Manual mode, on the other hand, restricts control to a single axis at a time. This functionality is particularly useful for precise TCP positioning, as the full-control mode may introduce unintentional rotation along unintended axes. Therefore, the manual mode is essential for fine-tuning the TCP position with high accuracy.
When reading data from the sensor module and buttons, it is necessary to create an appropriate data frame that is sent to the intermediary microcontroller between the control glove and the robot controller. The data received from the sensor module include information about rotation about the x, y, and z axes in degrees, each represented as a float variable. Additionally, information about the button states is encoded in an int variable consisting of four digits, which reflects the current state of the system. This allows the user to choose between the manual mode, in which each axis is controlled separately, and the full-control mode, in which all axes are controlled simultaneously. It also enables the activation of the pneumatic suction cup.
Information about the button status consists of four digits, as shown in Figure 8.
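A possible encoding of this frame is sketched below: three floats for the rotations followed by the four-digit status integer. The specific digit assignment (mode, suction cup, active axis, data-valid flag), the byte order, and the field names are assumptions made for illustration; the actual layout is defined by Figure 8 and the glove firmware.

```python
import struct

def pack_frame(rot_x, rot_y, rot_z, mode, suction, axis, valid):
    """Pack three float rotations (deg) and a four-digit status int into 16 bytes."""
    status = mode * 1000 + suction * 100 + axis * 10 + valid   # hypothetical digit order
    return struct.pack('<fffi', rot_x, rot_y, rot_z, status)

def unpack_frame(frame):
    rot_x, rot_y, rot_z, status = struct.unpack('<fffi', frame)
    digits = (status // 1000, (status // 100) % 10, (status // 10) % 10, status % 10)
    return (rot_x, rot_y, rot_z), digits

frame = pack_frame(12.5, -3.0, 45.2, mode=1, suction=0, axis=2, valid=1)
rotations, (mode, suction, axis, valid) = unpack_frame(frame)
```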
The prepared programs were implemented on the microcontroller. A PCB was then made according to the diagram presented in Figure 9.
The completed control glove is presented in Figure 10.

4. Experimental Results and Signal Filtering

This section describes the filtration process used to remove the measurement noise present in the hand-position measurements. Control using hand movement requires, first of all, the acquisition of precise output signals. Therefore, appropriate filtration of the measurements obtained from the sensor module is necessary. The project analyzed both basic recursive filters and more advanced filters, such as the commonly used Kalman filter. These filters differ in their operating principles, and it was verified whether recursive filters are sufficient for the needs of the project.

4.1. Theoretical Basis of Filtration

Data read from sensors, such as a gyroscope or accelerometer, often contain noise, which can lead to inaccuracies in readings and, consequently, measurement errors. The MPU6050 sensor allows data to be read from both the accelerometer and gyroscope. These data were imported into MATLAB R2023a for further analysis.
First, an analysis was performed on the unfiltered signals to observe how the presence of noise affects the quality of the readings. The results showed significant variation in the data over short time intervals, which could lead to difficulties in further processing and analysis. Various filtering methods were then applied to reduce these disturbances.
The basic element in filtering systems is a filter, which reduces unwanted components of the input signal through appropriate attenuation. Processing occurs in the time domain, which simultaneously alters the spectrum of the original signal, i.e., in the frequency domain. Filtering can be applied to both analog and digital signals using appropriate types of filters. The goal of signal filtering is to remove interference from the input signal and extract useful information, while also transforming the signal into another form, such as through differentiation filters, Hilbert filters, or filter banks for signal decomposition.

4.2. Recursive Filters

Recursive formulations are primarily used in filtering systems that handle large amounts of data, as they do not require all of the data to be stored in memory. Recursive methods only require the previous average value, the current sample, and the total number of data points. The first filter tested was the averaging filter. Averaging the measurement values is a good way to eliminate measurement noise, although it also removes information about the dynamics of the object, leaving only its average value.
The set of values obtained from the gyroscope is presented as $(x_1, x_2, x_3, \ldots, x_k)$ and consists of $k$ elements. The average value is:
$$\bar{x}_k = \frac{x_1 + x_2 + x_3 + \cdots + x_k}{k}$$
The above relationship can be transformed into a recursive relationship.
$$\bar{x}_k = \alpha \bar{x}_{k-1} + (1 - \alpha) x_k$$
where $\alpha = (k-1)/k$.
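A minimal sketch of this recursion applied to a stream of samples might look as follows (the gyroscope readings are made up):

```python
class RecursiveAverage:
    """Recursive (cumulative) average: keeps only the count and the running mean."""
    def __init__(self):
        self.k = 0
        self.mean = 0.0

    def update(self, x):
        self.k += 1
        alpha = (self.k - 1) / self.k            # alpha = (k-1)/k
        self.mean = alpha * self.mean + (1.0 - alpha) * x
        return self.mean

avg = RecursiveAverage()
for sample in [0.2, 0.5, 0.1, 0.4, 0.3]:         # illustrative gyro readings [deg/s]
    filtered = avg.update(sample)
```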
The result of using the averaging filter is shown in Figure 11.
This averaging filter is typically suitable for various measurements, effectively reducing fast-changing noise in a small amplitude range. However, in the case of measuring rotation speed, which can vary over a wide range, the filter does not provide the desired results. While it works well for eliminating noise in measurements with slower variations, it loses important information when the rotation speed changes rapidly, an effect that is particularly noticeable up to t = 0.5 s.
To reduce noise while preserving the dynamic properties of the signal, an averaging filter with a moving window can be used. The moving average is not calculated from all measurements, but only from a set of the most recently recorded values. The oldest sample is discarded when a new one arrives, and only the most recent measurements are used to determine the average. The moving average of the last $n$ elements is determined as:
$$\bar{x}_k = \frac{x_{k-n+1} + x_{k-n+2} + x_{k-n+3} + \cdots + x_k}{n}$$
The above relationship can be transformed into a recursive relationship:
$$\bar{x}_k = \bar{x}_{k-1} + \frac{x_k - x_{k-n}}{n}$$
The result of using the moving average filter is shown in Figure 12.
Compared with the averaging filter, the moving average represents a compromise between two opposing goals: response delay and the suppression of measurement noise. If the delay is too large, the window size n should be reduced, but this degrades the quality of measurement noise filtration.
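A sketch of the recursive moving average described above is given below; the window size and the sample values are illustrative, and the window is initialized with zeros, so the first n outputs form a warm-up transient.

```python
from collections import deque

class MovingAverage:
    """Recursive moving average over the last n samples."""
    def __init__(self, n):
        self.n = n
        self.window = deque([0.0] * n, maxlen=n)   # holds the last n samples
        self.mean = 0.0

    def update(self, x):
        oldest = self.window[0]                    # x_{k-n}, dropped by the append below
        self.window.append(x)
        self.mean += (x - oldest) / self.n         # mean_k = mean_{k-1} + (x_k - x_{k-n}) / n
        return self.mean

ma = MovingAverage(n=5)
for sample in [0.2, 0.5, 0.1, 0.4, 0.3, 0.6]:      # illustrative gyro readings [deg/s]
    filtered = ma.update(sample)
```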

4.3. Advanced Filtration Methods

The Kalman filter is a state observer that minimizes the mean squared estimation error: the algorithm estimates the internal state based on input and output measurements, and the resulting state estimate is statistically optimal. The Kalman filter can work in both linear and nonlinear systems (in the latter case, it is called the extended Kalman filter). The filtration process is described using a discrete-time state-space model:
$$x(t+1) = A x(t) + B u(t) + v(t)$$
$$y(t) = C x(t) + w(t)$$
In the given system, x(t) represents the state of the system at a specific time t, where t takes discrete values, such as t = 0, 1, 2, … The state vector x(t) contains the internal variables that describe the current condition or configuration of the system. Meanwhile, y(t) denotes the system’s output at time t, which is typically a function of the state and input of the system.
The system’s dynamics are governed by a set of matrices: A, B, and C. The matrix A, known as the state matrix, determines how the state evolves over time based on its current values. It describes the relationship between the system’s current state and the next state. The matrix B, the input matrix, defines how the input to the system influences the state. The input vector, often denoted as u(t), is multiplied by this matrix to determine its effect on the system’s state. The matrix C, the output matrix, governs the relationship between the state and the system’s output. It maps the state to the output, indicating how the internal system variables contribute to the observable quantities.
In summary, the system is defined by its states, outputs, and the way inputs affect both, with the matrices A, B, and C capturing the relationships that govern the system’s behavior over time.
The symbols v(t) and w(t) denote the process noise and the measurement noise, respectively. These are mutually independent realizations of white Gaussian noise with zero expected value and known covariance matrices V and W. They make it possible to account for the imperfections of the adopted model and the inaccuracies of the measurement equipment.
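For the scalar case with A = 1, B = 0, and C = 1 (a locally constant signal), a minimal sketch of the predict and correct recursion is given below; the noise variances V and W are illustrative and would need to be tuned against the real gyroscope data.

```python
class ScalarKalman:
    """One-dimensional Kalman filter for the model x(t+1) = x(t) + v, y(t) = x(t) + w."""
    def __init__(self, V=1e-4, W=0.05, x0=0.0, P0=1.0):
        self.V, self.W = V, W        # process and measurement noise variances
        self.x, self.P = x0, P0      # state estimate and its error variance

    def update(self, y):
        x_pred = self.x              # prediction step (A = 1, no input)
        P_pred = self.P + self.V
        K = P_pred / (P_pred + self.W)          # Kalman gain
        self.x = x_pred + K * (y - x_pred)      # correction with the measurement
        self.P = (1.0 - K) * P_pred
        return self.x

kf = ScalarKalman()
for measurement in [0.21, 0.48, 0.12, 0.41, 0.33]:   # illustrative noisy readings
    estimate = kf.update(measurement)
```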
The result of using the Kalman filter is shown in Figure 13.

4.4. Filtration Results

The Kalman filter is a more advanced filtering method, which is particularly effective in conditions where measurement noise and process disturbances are significant. It is an optimal algorithm that estimates the internal state based on both input data and output measurements. Thanks to its structure, the Kalman filter provides the most accurate results by eliminating both noise and inaccuracies in the model. The Kalman filter is statistically optimal, meaning it minimizes the mean squared error in state estimation.
Compared to averaging filters, the Kalman filter is more precise and adapts to changing conditions, considering both system dynamics and measurement noise. Its use in nonlinear systems (in the extended Kalman filter version) allows for even wider applications in more complex systems. Its advanced structure allows for better modeling of dynamic systems compared to simpler filtering methods, such as averaging and moving average filters. After analyzing the filters based on the hand movement measurements, the filtration results are summarized in Table 6.

5. Control System Implementation and Demonstration

After preparing the filtration system and constructing the glove, the system was used to control a real delta robot, which serves as a demonstration model. Its primary task is to precisely transport a ball, showcasing the capabilities of the technology in motion control. Thanks to the glove, the robot can smoothly replicate the operator’s hand movements, performing the task dynamically and accurately.
Additionally, this solution enables the execution of other tasks, such as precisely arranging objects in specific configurations, manipulating items in constrained spaces, or performing actions that require high accuracy, such as assembling small components. The advantage of this approach is its ability to intuitively control the robot in a natural way, which enhances efficiency in various industrial and educational applications, as well as allowing for quick adaptation to different manipulation tasks.
The control system developed in this project (presented in Figure 14) is a versatile solution that can be adapted to various types of robots, including tripod robots. Although this study focuses on a tripod robot, the control system is designed to accommodate multiple robotic configurations, such as delta robots, tripod robots, and other industrial robots. For the tripod robot, the control system addresses specific challenges arising from its configuration, particularly the need to synchronize the movements of its three legs to ensure stability and precision in executing tasks. The control algorithms are tailored to the kinematic and dynamic characteristics of this robot type while maintaining a general operational framework. This adaptability allows the system to be easily implemented across different robotic platforms, depending on their structural design and application requirements. The flexibility of this solution makes it suitable for a wide range of applications, from educational demonstrations to industrial tasks requiring precise object manipulation in demanding environments. Adapting the control system to a tripod robot primarily involves adjusting control parameters and algorithms to meet mechanical constraints, while the fundamental principles of control and data filtering remain unchanged.

6. Conclusions

The completed project lays the groundwork for further research on the sensory glove. The integration of flexible, lightweight, and conductive polymer materials presents an opportunity to enhance comfort, durability, and sensor performance. By refining the material composition, it is possible to develop a more ergonomic and user-friendly glove, making it better suited for long-term use in various robotic applications.
Future work will focus on implementing polymer-based sensors to improve motion tracking accuracy. These sensors could enhance the detection of both static positions and dynamic forces, allowing for more precise gesture control. By fine-tuning sensitivity and flexibility, the system could respond more effectively to user input, increasing its reliability in industrial and educational robotics. Moreover, advanced filtering algorithms will be explored to further refine data processing and minimize noise in motion capture.
The long-term goal is to create a highly adaptive control system that is seamlessly integrated with different robotic platforms. Expanding compatibility with various robot configurations will increase the versatility of the system, making it more accessible to a broader range of applications. Additionally, further research into wireless communication protocols and power efficiency will help improve the glove’s overall performance, ensuring seamless real-time interaction with robotic systems.

Author Contributions

Conceptualization, J.K. and T.T.; methodology, P.K.; software, J.K.; validation, P.K., M.S. and T.T.; formal analysis, T.T.; investigation, J.K.; resources, M.S.; data curation, M.S.; writing—original draft preparation, J.K.; writing—review and editing, T.T.; supervision, P.K.; project administration, T.T. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

Dataset available on request from the authors. The raw data supporting the conclusions of this article will be made available by the authors on request.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Panchal, P.B.; Nayak, V.H. A hand gesture based transceiver system for multiple application. In Proceedings of the 2015 2nd International Conference on Electronics and Communication Systems (ICECS), Coimbatore, India, 26–27 February 2015; pp. 679–684. [Google Scholar] [CrossRef]
  2. Huang, Y.; Zhang, Y.; Xiao, H. Multi-robot system task allocation mechanism for smart factory. In Proceedings of the 2019 IEEE 8th Joint International Information Technology and Artificial Intelligence Conference (ITAIC), Chongqing, China, 24–26 May 2019; pp. 587–591. [Google Scholar] [CrossRef]
  3. von Tiesenhausen, J.; Artan, U.; Marshall, J.A.; Li, Q. Hand Gesture-Based Control of a Front-End Loader. In Proceedings of the 2020 IEEE Canadian Conference on Electrical and Computer Engineering (CCECE), London, ON, Canada, 30 August–2 September 2020; pp. 1–4. [Google Scholar] [CrossRef]
  4. Maosheng, T.; Xiaoqi, T.; Yong, Z. Implementation and design of open control system for industrial robot based on double-CPU. In Proceedings of the 2011 IEEE 2nd International Conference on Computing, Control and Industrial Engineering, Wuhan, China, 20–21 August 2011; pp. 298–301. [Google Scholar] [CrossRef]
  5. Azraai, M.A.M.; Yahaya, S.Z.; Chong, I.A.; Soh, Z.H.C.; Hussain, Z.; Boudville, R. Head Gestures Based Movement Control of Electric Wheelchair for People with Tetraplegia. In Proceedings of the 2022 IEEE 12th International Conference on Control System, Computing and Engineering (ICCSCE), Penang, Malaysia, 21–22 October 2022; pp. 163–167. [Google Scholar] [CrossRef]
  6. Ahmed, M.A.; Zaidan, B.B.; Zaidan, A.A.; Salih, M.M.; Lakulu, M.M.B. A Review on Systems-Based Sensory Gloves for Sign Language Recognition State of the Art between 2007 and 2017. Sensors 2018, 18, 2208. [Google Scholar] [CrossRef] [PubMed]
  7. Helmi, N.; Helmi, M. Applying a neuro-fuzzy classifier for gesture-based control using a single wrist-mounted accelerometer. In Proceedings of the 2009 IEEE International Symposium on Computational Intelligence in Robotics and Automation—(CIRA), Daejeon, Republic of Korea, 15–18 December 2009; pp. 216–221. [Google Scholar] [CrossRef]
  8. Niranjana, R.; Darney, P.E.; Narayanan, K.L.; Krishnan, R.S.; Fernando, A.V.; Robinson, Y.H. Prolific Sensor Glove based Communication Device for the Disabled. In Proceedings of the 2021 5th International Conference on Trends in Electronics and Informatics (ICOEI), Tirunelveli, India, 3–5 June 2021; pp. 636–640. [Google Scholar] [CrossRef]
  9. Nguyen, B.P.; Tay, W.-L.; Chui, C.-K. Robust Biometric Recognition from Palm Depth Images for Gloved Hands. IEEE Trans. Hum. -Mach. Syst. 2015, 45, 799–804. [Google Scholar] [CrossRef]
  10. Haratiannejadi, K.; Fard, N.E.; Selmic, R.R. Smart Glove and Hand Gesture-based Control Interface for Multi-rotor Aerial Vehicles. In Proceedings of the 2019 IEEE International Conference on Systems, Man and Cybernetics (SMC), Bari, Italy, 6–9 October 2019; pp. 1956–1962. [Google Scholar] [CrossRef]
  11. Walker, C.R.; Nađ, Đ.; Antillon DW, O.; Kvasić, I.; Rosset, S.; Mišković, N.; Anderson, I.A. Diver-Robot Communication Glove Using Sensor-Based Gesture Recognition. IEEE J. Ocean. Eng. 2023, 48, 778–788. [Google Scholar] [CrossRef]
  12. Andreev, A.; Sutyrkina, K. Motion Stabilization Control of an Omni-Directional Mobile Robot with Four Wheels. In Proceedings of the 2023 5th International Conference on Control Systems, Mathematical Modeling, Automation and Energy Efficiency (SUMMA), Lipetsk, Russia, 8–10 November 2023; pp. 125–130. [Google Scholar] [CrossRef]
  13. Gao, H.; Wang, X.; Hu, J. Adaptive Tracking Control of Mobile Robots based on Neural Network and Sliding Mode Methods. In Proceedings of the 2023 38th Youth Academic Annual Conference of Chinese Association of Automation (YAC), Hefei, China, 19–21 May 2023; pp. 962–967. [Google Scholar] [CrossRef]
  14. Jongusuk, J.; Mita, T. Tracking control of multiple mobile robots: A case study of inter-robot collision-free problem. In Proceedings of the Proceedings 2001 ICRA. IEEE International Conference on Robotics and Automation (Cat. No.01CH37164), Seoul, Republic of Korea, 21–26 May 2001; Volume 3, pp. 2885–2890. [Google Scholar] [CrossRef]
  15. Tsai, C.-Y.; Song, K.-T. Face Tracking Interaction Control of a Nonholonomic Mobile Robot. In Proceedings of the 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, Beijing, China, 9–15 October 2006; pp. 3319–3324. [Google Scholar] [CrossRef]
  16. Wang, K.; Zhang, Z.; Tang, X.; Liang, X.; Wang, X.; Yin, H. Optimal transmission technology of transmission line data based on heterogeneous wireless network. In Proceedings of the 2021 2nd International Seminar on Artificial Intelligence, Networking and Information Technology (AINIT), Shanghai, China, 26–28 November 2021; pp. 536–539. [Google Scholar] [CrossRef]
  17. Abitova, G.; Beisenbi, M.; Nikulin, V. Complex automation of a technological process on the basis of control systems with a three level structure. In Proceedings of the 2011 IEEE International Systems Conference, Montreal, QC, Canada, 4–7 April 2011; pp. 34–37. [Google Scholar] [CrossRef]
  18. Uzzaman, N.; Hossain, S.; Rashid, R.; Hossain, A. A comparative analysis between light dependent and ultrasonic method of gesture recognition. In Proceedings of the 2016 3rd International Conference on Electrical Engineering and Information Communication Technology (ICEEICT), Dhaka, Bangladesh, 22–24 September 2016; pp. 1–5. [Google Scholar] [CrossRef]
  19. Lu, L.Y.; Chang, Z.; Lu, Y.; Wang, Y. Development and kinematics/statics analysis of rigid-flexible-soft hybrid finger mechanism with standard force sensor. Robot. Comput.-Integr. Manuf. 2021, 67, 101978, ISSN 0736-5845. [Google Scholar] [CrossRef]
  20. Darapureddy, N.; Kurni, M.; Saritha, K. A Comprehensive Study on Artificial Intelligence and Robotics for Machine Intelligence. In Methodologies and Applications of Computational Statistics for Machine Intelligence; IGI Global: Hershey, PA, USA, 2021. [Google Scholar] [CrossRef]
  21. Available online: https://automatykaonline.pl/Artykuly/Robotyka/roboty-przemyslowe-o-rownoleglej-strukturze-kinematycznej (accessed on 10 May 2024).
  22. Liu, C.; Cao, G.-H.; Qu, Y.-Y. Workspace Analysis of Delta Robot Based on Forward Kinematics Solution. In Proceedings of the 2019 3rd International Conference on Robotics and Automation Sciences (ICRAS), Wuhan, China, 1–3 June 2019; pp. 1–5. [Google Scholar] [CrossRef]
  23. Le, M.-T.; Thuong, L.H.; Tung, P.T.; Pham, C.-T.; Nguyen, C.-N. Performance Evaluation of Fuzzy-PID and GA-PID Controllers on a 3-DOF Delta Robot Tracking Control. In Proceedings of the 2022 International Conference on Control, Robotics and Informatics (ICCRI), Danang, Vietnam, 16–18 December 2022; pp. 1–10. [Google Scholar] [CrossRef]
  24. Escobar, L.; Bolaños, E.; Bravo, X.; Comina, M.; Hidalgo, J.L.; Ibarra, A. Kinematic resolution of delta robot using four bar mechanism theory. In Proceedings of the 2017 IEEE International Conference on Mechatronics and Automation (ICMA), Takamatsu, Japan, 6–9 August 2017; pp. 881–887. [Google Scholar] [CrossRef]
  25. Tho, T.P.; Thinh, N.T.; Tuan, N.T.; Nhan, M.N.T. Solving inverse kinematics of delta robot using ANFIS. In Proceedings of the 2015 15th International Conference on Control, Automation and Systems (ICCAS), Busan, Republic of Korea, 13–16 October 2015; pp. 790–795. [Google Scholar] [CrossRef]
  26. Gritsenko, I.; Seidakhmet, A.; Abduraimov, A.; Gritsenko, P.; Bekbaganbetov, A. Delta robot forward kinematics method with one root. In Proceedings of the 2017 International Conference on Robotics and Automation Sciences (ICRAS), Hong Kong, China, 26–29 August 2017; pp. 39–42. [Google Scholar] [CrossRef]
  27. Available online: https://vitalia.pl (accessed on 10 May 2024).
  28. Dudzik, S.; Podsiedlik, A.; Rapalski, A. Test stand to design control algorithms for mobile robots. Prz. Elektrotech 2021, 3, 97. [Google Scholar] [CrossRef]
  29. Crisnapati, P.N.; Maneetham, D.; Thwe, Y.; Aung, M.M. Enhancing Gimbal Stabilization Using DMP and Kalman Filter: A Low-Cost Approach with MPU6050 Sensor. In Proceedings of the 2023 11th International Conference on Cyber and IT Service Management (CITSM), Makassar, Indonesia, 24–25 August 2023; pp. 1–5. [Google Scholar] [CrossRef]
  30. Sultan, J.M.; Zani, N.H.; Azuani, M.; Ibrahim, S.Z.; Yusop, A.M. Analysis of Inertial Measurement Accuracy using Complementary Filter for MPU6050 Sensor. J. Kejuruter. 2022, 34, 959–964. [Google Scholar] [CrossRef]
  31. Paweł, K.; Krzus, J. The use of hand movement in space to control mechatronic devices-system design. Prz. Elektrotechniczny 2023, 99, 196–199. (In Polish) [Google Scholar]
  32. Available online: https://botland.store (accessed on 5 May 2024).
  33. Available online: https://ntronic.pl/modbus-tcp/ (accessed on 17 May 2024).
  34. Available online: https://automatykaonline.pl/Artykuly/Komunikacja/ethernet-przemyslowy-czesc-3-z-4-modbus-tcp-i-sercos-iii (accessed on 17 May 2024).
  35. Gomez, D.; Aguero, R.; Garcia-Arranz, M.; Ros, D. TCP Acknowledgement Encapsulation in Coded Multi-Hop Wireless Networks. In Proceedings of the 2014 IEEE 79th Vehicular Technology Conference (VTC Spring), Seoul, Republic of Korea, 18–21 May 2014; pp. 1–5. [Google Scholar] [CrossRef]
  36. Augusiak, A.; Pomykacz, S.; Redmerski, J. PLC Controller as a Modbus-IEC 61850 Communication Translator. Prz. Elektrotechniczny 2023, 99, 296–299. (In Polish) [Google Scholar]
Figure 1. Classification of industrial robots.
Figure 2. Delta robot kinematics [22].
Figure 3. The concept of the control system.
Figure 4. The control glove: (a) inner side of the glove [27]; (b) outside of the glove [27].
Figure 5. Microcontroller requirements.
Figure 6. Modbus TCP/IP data frame [33].
Figure 7. Control algorithm.
Figure 8. Information about the button status.
Figure 9. Electrical connection diagram.
Figure 10. Control glove.
Figure 11. The result of using the averaging filter.
Figure 12. Comparison of filtration for different window sizes.
Figure 13. Kalman filter.
Figure 14. Control process: (a) initial position; (b) final position.
Table 1. Comparison of industrial robots.

Robot | Kinematic Type | Features | Advantages
UR3e (Universal Robots) | Serial kinematics | Lightweight, compact design, precise, easy to program | High flexibility, ease of programming, small size
ABB FlexPicker | Delta kinematics | Lightweight design, fast movements, precise manipulation | High precision, speed, suitable for high hygienic environments
YK400 (Yamaha SCARA) | SCARA kinematics | Fast horizontal manipulation, high precision | Fast, precise movements, high repeatability
Bosch Cartesian Robots | Cartesian kinematics | Simple design, high accuracy, movement in three axes | Stability, ease of programming, high accuracy
Table 2. Specifications of the MPU6050 [31].

Specification | Value
Accelerometer sensitivity | ±2 g, ±4 g, ±8 g, ±16 g
Gyroscope operating ranges | 250°/s, 500°/s, 1000°/s, 2000°/s
Supply voltage | 3–5 V
Current consumption | 350 μA
Operating temperature range | −40 to +85 °C
Interface | I2C
Dimensions | 25 mm × 13 mm
Table 3. Specifications of the Raspberry Pi 3B+ [31].

Interface | Function
Four USB connectors | Allows for the connection of various peripherals, such as a mouse or keyboard
Ethernet socket | Serves as a socket for communication with an external device via the Modbus protocol
WiFi module | Allows communication with a 2.4 GHz and 5 GHz 802.11b/g/n/ac wireless network
Bluetooth 4.2 module | Enables data transfer via a popular interface
GPIO | Outputs/inputs that can be programmed for specific functions
Table 4. Hardware list.

Device | Function
ESP32 | Reading data from the sensor module and sending data via WiFi to the Raspberry Pi 3B+
MPU6050 | Tracking hand movements in the control glove
Raspberry Pi 3B+ | Receiving data sent from the ESP32 and sending them to the robot controller using the Modbus protocol
OLED display | Displaying information about the current hand position and system initialization
Table 5. Functions of buttons.

Button | Function
Button 1 | Sends system status information to the intermediary microcontroller, transmitting a complete data frame that includes details about the current position and the state of the other buttons
Button 2 | Activates the pneumatic suction cup, which serves as the TCP of the delta robot
Button 3 | Deactivates the pneumatic suction cup
Button 4 | Switches the mode from full mode to manual mode
Button 5 | In manual mode, changes the currently controlled axis
Table 6. Comparison of filters.

Filter | Advantages | Disadvantages
Averaging filter | Effectively reduces high-frequency noise; easy to apply | Loses dynamic signal information; low effectiveness in the case of rapid changes in the signal
Moving average filter | Responds better to signal changes than the averaging filter; can be adjusted for different window sizes | Larger windows cause increased delay in responses; not optimal for highly variable signals
Kalman filter | Statistically optimal; minimizes mean squared error; effective in dynamic, nonlinear systems | Involves advanced mathematics, making implementation challenging; requires knowledge of system parameters
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
