Article

Visual Localization and Path Planning for a Dual-Arm Collaborative Pottery Robot

1 School of Art, Soochow University, Suzhou 215031, China
2 School of Mechanical and Electrical Engineering, Soochow University, Suzhou 215031, China
* Authors to whom correspondence should be addressed.
Symmetry 2025, 17(4), 532; https://doi.org/10.3390/sym17040532
Submission received: 4 March 2025 / Revised: 27 March 2025 / Accepted: 27 March 2025 / Published: 31 March 2025
(This article belongs to the Section Computer)

Abstract

The convergence of art and technology is an emerging trend, and applying advanced technologies to traditional art forms has become crucial to driving innovation in how art is created, experienced, and understood. The aim of this paper is to explore new paths for the integration of art and technology through the development of a symmetric dual-arm collaborative ceramics robot, thereby facilitating the automation and technologization of ceramics creation. This study employs machine vision techniques and path planning algorithms: machine vision is used to identify and locate the clay blanks, and the production of purple clay teapots is completed through dual-arm collaboration. The successful application of the pottery robot demonstrates the feasibility of technology-enabled art, which helps to promote innovative development across the art field and to broaden the forms and boundaries of artistic expression.

1. Introduction

As a craft with a long history of human civilization, pottery transforms clay into works of artistic value through handcrafting, carving, and decorating techniques [1]. However, the traditional process of making pottery by hand is not only time-consuming and laborious, but also requires great skill from the artisans. With the progress of society and the development of technology, how to utilize modern technology to enhance the productivity, quality, and innovation of this traditional craft has become a new research hotspot. The dual-arm collaborative pottery robot based on visual localization and path planning proposed in this paper aims to integrate automation and intelligence into ceramic art creation, which not only improves the production efficiency and precision, but also preserves the unique charm and creativity of ceramic art.
Existing robotic technologies in ceramics focus on automated molding, where highly consistent and complex designs are produced by programmatically controlling robotic arms, and firing process optimization, where intelligent monitoring and automated kiln control systems are utilized to ensure the stability of the firing process. However, these technologies face limitations such as inflexibility, high cost, and lack of haptic feedback, which restrict application to non-standard or creative tasks and pose barriers to entry for small studios and individual artists. To overcome these issues, future directions include applying artificial intelligence and machine learning to improve the flexibility and creativity of creation, reducing system costs through technological advances and scaling, and developing new sensors to enhance the haptic sensing capabilities of robots to improve the precision and quality of operations.
Vision positioning technology enables the robot to accurately identify and locate every detail in the work area. Researchers have made many attempts in the field of visual recognition and control. By adopting an improved speeded-up robust features (SURF) algorithm, the efficiency and accuracy of feature point matching in a robot's binocular vision system can be enhanced, thus strengthening the robot's ability to perform visual localization and target recognition in complex environments [2,3]. In addition, researchers have proposed an optimization-based controller for accessible visual servo control when teleoperating a symmetric dual-arm robot [4,5].
Furthermore, the application of kinematic inverse solution techniques solves the problem of how the robot moves from one point to another, which is particularly important when performing fine manipulation in complex 3D space [6,7]. By calculating the target position, the robot can determine the optimal sequence of movements to complete the task in the most efficient way [8,9]. This means that even the most challenging ceramic works can be handled with ease by ceramic robots. Path planning, in turn, ensures that each step achieves the desired artistic effect while maintaining smoothness and efficiency of movement. Through pre-set or real-time adjusted paths, the robot is able to execute processes such as billeting, carving, and glazing without damaging the work [10]. In addition, path planning can be personalized according to different creative needs, giving creators greater freedom and room for expression. For example, path planning is one of the core technologies for autonomous navigation in autonomous driving [11,12], for path optimization in unmanned aerial vehicle (UAV) delivery systems [13,14,15], for automated guided vehicles (AGVs) in warehousing and logistics [16], and for instrument motion in robotic surgery, where it ensures the safety and efficacy of minimally invasive procedures [17].
For dexterous hands, researchers have designed a novel integrated link-driven dexterous anthropomorphic hand (ILDA hand) [18]. The manipulator achieves multi-degree-of-freedom movement of the fingers through integrated parallel and tandem mechanisms, and high fingertip force and back driving capability through an efficient power transmission structure. Several other researchers have designed a soft manipulator [19,20] for delicate in-hand manipulation. By designing highly compliant soft fingers, they can perform basic action primitives even in the presence of uncertainty. For some precision assembly tasks, the researchers proposed a strategy and implementation method for “peg-in-hole” assembly tasks using a dual-arm robot and a dexterous manipulator [21].
Pottery robots can fulfill their unique value in a variety of fields. For example, in education, they can be used for teaching demonstrations and student practice; in art creation, they can help artists realize novel designs; in industrial production [22], they can improve manufacturing efficiency and reduce costs; and in cultural heritage preservation, they can record and pass on traditional skills. In short, the ceramic robot represents the deep integration of art and technology, which not only broadens the boundaries of what is possible in ceramic creation, but also provides artists with a new platform for self-expression and innovative exploration.
Adopting a systematic research framework, this paper focuses on the design and realization of a dual-arm collaborative pottery robot and is organized into five sections. Section 1 is the introduction, which clarifies the research background, reveals the research gaps in the field of robotic art creation through a literature analysis, and establishes the research objectives. Section 2 is the theoretical modeling, including the kinematic modeling based on the Denavit–Hartenberg (D-H) parametric method and the construction of the 3D solid model. Section 3 is the methodology, including visual localization, trajectory planning with the kinematic inverse solution algorithm, and the dynamic coordinated control strategy for both arms. Section 4 implements simulation on the MATLAB platform (R2024b) to verify the program generation of the traditional tapping action, and evaluates the system accuracy based on the experimental data. Section 5 summarizes the innovative results, points out the technical advantages of the dual-arm collaboration mechanism in artistic creation, and proposes future research directions. The contributions of this paper are as follows:
  • Combines robots with pottery, filling a gap in pottery automation.
  • Achieves precise recognition and localization through 3D stereoscopic machine vision, realizing the combination of robotics and ceramics.
  • Proposes a fully automated ceramic production process, promoting the industrialization of ceramic products and the inheritance of pottery-making skills.

2. Modeling and Analysis of the Dual-Arm Collaborative Pottery Robot

In this section, the requirements analysis of the ceramic robot will be explored in detail, leading to the completion of the equipment selection and modeling.

2.1. Demand Analysis

The pottery robot uses visual recognition technology to determine whether the clay blank is in the desired position. During the operation, both arms work in tandem, with one arm holding a special wooden racket to shape the purple clay teapot, while the other arm presses against the body of the pot from the inside to stabilize its shape. Throughout the entire process, the system continuously monitors the status of the clay blank and uses real-time feedback to assess the progress of the process and determine whether the required shaping process has been completed.
In the process of making purple clay teapots, patting the body barrel is a key part of shaping the pot, and this paper takes this process as the object of study and discusses it in detail [23]. In the purple clay teapot production process, after the initial shaping of the pot’s body contour, in the traditional craft, the maker will use a special wooden racket to roll out the body (i.e., the body barrel) to ensure that the shape of the purple clay teapot is precise and aesthetically pleasing. As shown in Figure 1, this process not only tests the skill of the maker, but also requires that the lines on the body of the pot must conform strictly to the design criteria. To do this, the maker holds two fingers of his left hand against the inside of the body to stabilize it and sense the strength of the tapping, while his right hand holds a special wooden racket and rhythmically taps along the body in a circular motion, gradually bringing it to its ideal shape.
The technique of tapping is at the heart of this process. The direction, strength, and frequency of the tapping need to be precisely adjusted to meet the requirements of the particular design of the pot. Typically, the tapping starts at the base of the pot and gradually moves upwards, creating a graceful curve and appropriate thickness of the body through even and powerful tapping, ensuring that each part of the pot meets the design criteria. Force control is also critical in the tapping process. The force of the tap needs to be just right, to ensure that the clay is sufficiently compacted to make the body of the pot tightly structured, but not so hard as to cause deformation or breakage of the pot. Throughout the process, the maker needs to keep an eye on the changes in the body and adjust the force of the patting according to the characteristics of the clay to ensure the quality and integrity of the pot.
Remark 1.
Based on the specific needs of the tapping operation in purple clay teapot making, i.e., one hand taps the clay blank for shaping while the other supports it from the inside, this paper proposes an innovative dual-arm collaborative approach. Two Intel RealSense D435i depth cameras are selected as the visual input devices to continuously localize the position of the clay blank. In particular, a conventional model predictive control (MPC) approach is used for path planning to keep the arms moving on a predefined optimal trajectory.

2.2. D-H Model of the Pottery Robot

Both arms of the pottery robot possess 6 DoF. Each joint is named as follows: shoulder, upper arm, elbow, forearm, wrist, and end-effector, with corresponding internal drive mechanisms designed for these joints. The kinematics of the arm is modeled using D-H parameters, a standardized method for describing the relative position and orientation relationships between the links and joints of a mechanical arm [24,25,26,27]. The D-H model describes the relative positions and orientations of adjacent coordinate systems by establishing a coordinate system at each joint and relating them through homogeneous transformation matrices [28,29]. The D-H model is particularly suitable for solving both forward and inverse kinematics.
Firstly, the concepts of joints and links are clarified. The rotating or translating parts of a robot arm are called joints, and the rigid body connecting two joints is called a link. When establishing the coordinate systems, each link is first assigned a local coordinate system, usually indexed by $i$. The base coordinate system (Link 0) is fixed to the base of the robot arm and serves as the reference frame for the whole system. The direction of the $Z$-axis is determined first: the $Z_0$ axis usually coincides with the first joint axis and is fixed to the base, while for the other links the $Z_i$ axis is the rotation axis of each joint, pointing from joint $i$ to joint $i+1$. Next, the direction of the $X$-axis is determined: the $X_i$ axis lies along the common perpendicular of the $Z_{i-1}$ and $Z_i$ axes, pointing from the $Z_{i-1}$ axis toward the $Z_i$ axis. Finally, according to the right-hand rule, the direction of the $Y$-axis is determined from the $X_i$ and $Z_i$ axes. The specific coordinate system is shown in Figure 2. The transformation matrix for each joint of the mechanical arm is defined by four parameters: joint angle $\theta$, link length $a$, link twist $\alpha$, and link offset $d$. Together, these parameters determine the transformation relationship between two adjacent joints. The UR5 robot arm is selected for this symmetric dual-arm pottery robot, and the link coordinate systems are established according to the above method to obtain the four D-H parameters, as shown in Table 1.
These D-H data were input into the robotics toolbox in MATLAB for simulation and visualization to obtain the validation model of the mechanical arm in Section 4.2 and Section 4.3.
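Outside MATLAB, the same forward-kinematics chain can be checked with a short script. The sketch below uses the standard (distal) D-H convention and nominal UR5 parameters taken from public UR specifications; these numbers are assumptions for illustration, and the values in Table 1 are authoritative for the robot described here.

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Homogeneous transform between adjacent links (standard D-H convention)."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

# Nominal UR5 parameters (assumed here; Table 1 of the paper takes precedence).
UR5_D     = [0.089159, 0.0, 0.0, 0.10915, 0.09465, 0.0823]   # link offsets [m]
UR5_A     = [0.0, -0.425, -0.39225, 0.0, 0.0, 0.0]           # link lengths [m]
UR5_ALPHA = [np.pi/2, 0.0, 0.0, np.pi/2, -np.pi/2, 0.0]      # link twists [rad]

def forward_kinematics(joint_angles):
    """Chain the six link transforms to obtain the base-to-end-effector pose."""
    T = np.eye(4)
    for theta, d, a, alpha in zip(joint_angles, UR5_D, UR5_A, UR5_ALPHA):
        T = T @ dh_transform(theta, d, a, alpha)
    return T

home = forward_kinematics([0.0] * 6)
print(np.round(home, 4))
```

The same chain, evaluated joint by joint, is what the MATLAB robotics toolbox visualizes in Section 4.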

2.3. 3D Solid Model of the Pottery Robot

The pottery robot presented in this paper possesses a unique dual-arm structure; the UR5 mechanical arm is selected because its range of motion covers simple pottery-making application scenarios, enabling efficient automated production [30]. In addition, given the important role of machine vision in the pottery production process, an area on the robot's torso is specially reserved for the cameras, which serve as the robot's eyes and provide accurate visual perception. Cast iron is used for the body support, because the body must be strong enough to carry the two mechanical arms and ensure the stability and safety of the robot during operation, as shown in Figure 3.
For the pottery robot, the flexibility of the end-effector is crucial. In the process of pottery creation, various raw materials, such as alabaster clay, need to be finely shaped. In this paper, Robotiq gripper jaws are chosen as the end-effector, as shown in Figure 4; they can flexibly perform operations such as pinching, kneading, pulling, and pressing, satisfying the high requirements for detail and artistry in ceramic production.
Based on the above description of the pottery robot, this paper used SolidWorks for three-dimensional modeling, covering mainly the base, arms, eyes, and end-effectors; the specific result is shown in Figure 5. The range of motion of the UR5 covers simple pottery-making application scenarios and allows for efficient automated production. Specific parameters are shown in Table 2. This pottery robot integrates a binocular vision system that allows the two mechanical arms to work in tandem to accurately make pottery.

3. Methodology for the Dual-Arm Collaborative Pottery Robot

This section explores the basic concepts of robot visual localization, pose and trajectory planning, as well as solving the dynamics model of the UR5 mechanical arm.

3.1. Visual Localization

Visual localization aims to establish an accurate correspondence between the 2D position of an object in an image and its 3D position in the real environment [31,32]. Through high-precision cameras and image processing algorithms, the coordinate positions and angles of the object and the end of the mechanical arm are accurately acquired, enabling the mechanical arm to perform precise operations such as grasping and tapping. A key challenge for vision systems is ensuring localization accuracy under different lighting conditions and in the presence of occlusions [6,33]. This system uses a multi-camera setup to collect data from different angles [34], so that even if one viewpoint is occluded, the other viewpoints can still provide effective information. This approach not only improves the robustness of the system, but also enhances its ability to understand complex scenes. See Section 4.1.
In achieving this conversion, hand–eye calibration plays a key role. Hand–eye calibration determines the relationship between the camera coordinate system and the mechanical arm coordinate system, which is the basis for vision-guided positioning. Through hand–eye calibration, the image coordinate system can be mapped to the mechanical arm coordinate system to achieve accurate positioning and operation, as shown in Figure 6. The visual localization principle of the symmetric dual-arm collaborative pottery robot mainly involves several key steps, namely image acquisition, feature extraction, position computation, and motion control, detailed as follows.
  • Image acquisition: the working area is photographed by cameras (in this case, two Realsense D435i cameras are used) mounted on the robot. These cameras capture real-time images of the pottery making process, including information such as the shape and color of the clay and the position of the tools.
  • Feature extraction: the acquired image data are processed using computer vision algorithms to identify and extract feature points of interest. For pottery, these features may include the contours of the clay, the surface texture, or specific decorative elements. This process usually involves techniques such as edge detection, color analysis, and pattern matching [5,35,36].
  • Position computation: based on the extracted feature information, geometric molding or template matching methods will be used to determine the precise position and attitude of the clay and its associated tools with respect to the robot. This step needs to take into account the calibration parameters of the camera, i.e., the internal and external parameters of the camera, to ensure an accurate conversion from 2D image to 3D spatial coordinates.
  • Motion control: based on the calculated target position information, the control system generates commands to the two-armed collaborative robots, instructing them how to move in order to complete the given operation tasks, such as shaping, sculpting, or glazing. In order to achieve precision, a force feedback mechanism needs to be considered so that the robot can adapt to changes in the hardness or softness of the clay and make appropriate adjustments.
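For a depth camera such as the D435i, the position-computation step above reduces to pinhole back-projection of a pixel with its measured depth, followed by the hand–eye transform into the robot base frame. A minimal sketch, where the intrinsic values are illustrative assumptions (real values come from the calibration of Section 4.1):

```python
import numpy as np

def deproject_pixel(u, v, depth_m, fx, fy, cx, cy):
    """Pinhole back-projection: pixel (u, v) plus depth -> 3D camera coordinates."""
    x = (u - cx) / fx * depth_m
    y = (v - cy) / fy * depth_m
    return np.array([x, y, depth_m])

def camera_to_robot(point_cam, T_base_cam):
    """Apply the hand-eye calibration result (a 4x4 homogeneous transform)."""
    p = np.append(point_cam, 1.0)          # homogeneous coordinates
    return (T_base_cam @ p)[:3]

# Illustrative intrinsics roughly in the range of a D435i color stream (assumed).
fx, fy, cx, cy = 615.0, 615.0, 320.0, 240.0
p_cam = deproject_pixel(350, 260, 0.42, fx, fy, cx, cy)
```

A point detected on the clay blank is thus expressed in the base frame of either arm, which is what the motion-control step consumes.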
Throughout the process, the precision and response speed of the vision positioning system directly affects the quality of the finished pottery product. With the development of artificial intelligence technology, modern ceramic robots can also learn different styles of pottery and improve their creativity by continuously optimizing their operating strategies.

3.2. Trajectory Planning

After establishing the D-H parametric model and 3D solid model of the dual-arm robot, an analytical method is used to compute the kinematic forward and inverse solutions of the 6DoF dual-arm structure, and trajectory planning is completed in the Cartesian coordinate system [37]. By this method, the relationship between the motion of end-effector of the dual arms and the activities of each joint can be accurately described to ensure the coordination and accuracy of the movements when the dual arms collaborate. Ultimately, the analytical method is used to solve the kinematic inverse problem of the mechanical arm, which provides theoretical support for efficient motion planning and coordinated control of the symmetric dual-arm collaborative pottery robot.

3.2.1. Kinematic Model

The study of the kinematic model of a dual-arm robot is the key to achieving its motion planning and coordinated control. Although the dual-arm robot is structurally composed of two single-arm robots, its kinematic model cannot simply be regarded as a superposition of two single-arm models, because the two arms are physically coupled during motion, and this coupling makes the kinematic analysis more challenging. In robot kinematics, the elementary rotation and translation transformations can be expressed as
\[
\mathrm{Rot}(x_{i-1},\alpha_i) =
\begin{bmatrix}
1 & 0 & 0 & 0\\
0 & \cos\alpha_i & -\sin\alpha_i & 0\\
0 & \sin\alpha_i & \cos\alpha_i & 0\\
0 & 0 & 0 & 1
\end{bmatrix},
\qquad
\mathrm{Trans}(x_{i-1},a_i) =
\begin{bmatrix}
1 & 0 & 0 & a_i\\
0 & 1 & 0 & 0\\
0 & 0 & 1 & 0\\
0 & 0 & 0 & 1
\end{bmatrix},
\]
\[
\mathrm{Rot}(z_{i-1},\theta_i) =
\begin{bmatrix}
\cos\theta_i & -\sin\theta_i & 0 & 0\\
\sin\theta_i & \cos\theta_i & 0 & 0\\
0 & 0 & 1 & 0\\
0 & 0 & 0 & 1
\end{bmatrix},
\qquad
\mathrm{Trans}(z_{i-1},d_i) =
\begin{bmatrix}
1 & 0 & 0 & 0\\
0 & 1 & 0 & 0\\
0 & 0 & 1 & d_i\\
0 & 0 & 0 & 1
\end{bmatrix}.
\]
Based on the D-H model parameters introduced in Section 2.2, the transformation matrix from link $i-1$ to link $i$ is established as
\[
{}^{i-1}T_{i} = \mathrm{Rot}(x,\alpha_{i-1})\,\mathrm{Trans}(x,a_{i-1})\,\mathrm{Rot}(z,\theta_i)\,\mathrm{Trans}(z,d_i)
=
\begin{bmatrix}
\cos\theta_i & -\sin\theta_i & 0 & a_{i-1}\\
\sin\theta_i\cos\alpha_{i-1} & \cos\theta_i\cos\alpha_{i-1} & -\sin\alpha_{i-1} & -d_i\sin\alpha_{i-1}\\
\sin\theta_i\sin\alpha_{i-1} & \cos\theta_i\sin\alpha_{i-1} & \cos\alpha_{i-1} & d_i\cos\alpha_{i-1}\\
0 & 0 & 0 & 1
\end{bmatrix}.
\]
Calculate the total transformation matrix and multiply all the transformation matrices together to obtain the total transformation matrix from the base coordinate system to the end-effector coordinate system
\[
{}^{0}T_{6} = {}^{0}T_{1}\,{}^{1}T_{2}\,{}^{2}T_{3}\,{}^{3}T_{4}\,{}^{4}T_{5}\,{}^{5}T_{6}.
\]

3.2.2. Kinematic Inverse Solutions

Mechanical arm inverse kinematics is the process of solving for the joint angles when the end-effector pose is known. Common solution methods include numerical, analytical, and geometric methods [38]. The geometric method suits mechanical arms with simple structures but struggles with complex ones; the numerical method solves the nonlinear equations through iterative algorithms, but its stability and accuracy are hard to guarantee; the analytical method solves for the joint angles through algebraic operations. The pottery robot adopts the UR5 mechanical arm, which consists of six revolute joints, and a closed-form solution exists. In this paper, a combination of geometric and analytical methods is used to solve the inverse kinematics equations [39]. According to the general equation of the transformation matrix from link $i-1$ to link $i$ given in Section 3.2.1, substituting the D-H parameters yields the six transformation matrices
\[
{}^{0}T_{1} =
\begin{bmatrix}
\cos\theta_1 & -\sin\theta_1 & 0 & 0\\
\sin\theta_1 & \cos\theta_1 & 0 & 0\\
0 & 0 & 1 & d_1\\
0 & 0 & 0 & 1
\end{bmatrix},
\quad
{}^{1}T_{2} =
\begin{bmatrix}
-\sin\theta_2 & -\cos\theta_2 & 0 & 0\\
0 & 0 & -1 & 0\\
\cos\theta_2 & -\sin\theta_2 & 0 & 0\\
0 & 0 & 0 & 1
\end{bmatrix},
\]
\[
{}^{2}T_{3} =
\begin{bmatrix}
\cos\theta_3 & -\sin\theta_3 & 0 & a_3\\
\sin\theta_3 & \cos\theta_3 & 0 & 0\\
0 & 0 & 1 & 0\\
0 & 0 & 0 & 1
\end{bmatrix},
\quad
{}^{3}T_{4} =
\begin{bmatrix}
-\sin\theta_4 & -\cos\theta_4 & 0 & a_4\\
\cos\theta_4 & -\sin\theta_4 & 0 & 0\\
0 & 0 & 1 & d_4\\
0 & 0 & 0 & 1
\end{bmatrix},
\]
\[
{}^{4}T_{5} =
\begin{bmatrix}
\cos\theta_5 & -\sin\theta_5 & 0 & 0\\
0 & 0 & -1 & -d_5\\
\sin\theta_5 & \cos\theta_5 & 0 & 0\\
0 & 0 & 0 & 1
\end{bmatrix},
\quad
{}^{5}T_{6} =
\begin{bmatrix}
\cos\theta_6 & -\sin\theta_6 & 0 & 0\\
0 & 0 & 1 & d_6\\
-\sin\theta_6 & -\cos\theta_6 & 0 & 0\\
0 & 0 & 0 & 1
\end{bmatrix}.
\]
The kinematic equation for the homogeneous transformation matrix from the end-effector coordinate system $\{6\}$ of the mechanical arm to the base coordinate system $\{0\}$ is
\[
{}^{0}T_{6} = {}^{0}T_{1}\,{}^{1}T_{2}\,{}^{2}T_{3}\,{}^{3}T_{4}\,{}^{4}T_{5}\,{}^{5}T_{6}
=
\begin{bmatrix}
n_x & o_x & a_x & p_x\\
n_y & o_y & a_y & p_y\\
n_z & o_z & a_z & p_z\\
0 & 0 & 0 & 1
\end{bmatrix}
=
\begin{bmatrix}
{}^{0}R_{6} & P_b\\
0 & 1
\end{bmatrix}.
\]
To solve $\theta_1$, multiply both sides of Equation (8) on the left by $({}^{0}T_{1})^{-1}$:
\[
{}^{1}T_{6} = ({}^{0}T_{1})^{-1}\,{}^{0}T_{6}
=
\begin{bmatrix}
\cos\theta_1 & \sin\theta_1 & 0 & 0\\
-\sin\theta_1 & \cos\theta_1 & 0 & 0\\
0 & 0 & 1 & -d_1\\
0 & 0 & 0 & 1
\end{bmatrix}
\begin{bmatrix}
n_x & o_x & a_x & p_x\\
n_y & o_y & a_y & p_y\\
n_z & o_z & a_z & p_z\\
0 & 0 & 0 & 1
\end{bmatrix}.
\]
From Equation (9) and the condition $-\sin\theta_1\,p_x + \cos\theta_1\,p_y = 0$, it is possible to obtain
\[
\theta_1 = \tan^{-1}\!\left(\frac{p_y}{p_x}\right),
\]
or
\[
\theta_1 = \tan^{-1}\!\left(\frac{p_y}{p_x}\right) + 180^{\circ}.
\]
The inverse kinematics of a planar two-link mechanical arm is shown in Figure 7. If the β angle is restricted to the interval [0, π ], according to the cosine theorem, it is obtained as
\[
L_1^2 + L_2^2 - 2L_1 L_2\cos\beta = x^2 + y^2.
\]
From this it follows:
\[
\beta = \cos^{-1}\!\left(\frac{L_1^2 + L_2^2 - x^2 - y^2}{2L_1 L_2}\right).
\]
Similarly, α can be obtained by the cosine theorem, which is
\[
\alpha = \cos^{-1}\!\left(\frac{L_1^2 - L_2^2 + x^2 + y^2}{2L_1\sqrt{x^2 + y^2}}\right).
\]
Define $\gamma = \operatorname{atan2}(y, x)$; the rightward solution of the inverse kinematics is
\[
\theta_2 = \gamma - \alpha, \qquad \theta_3 = \pi - \beta,
\]
while the leftward solution can be written as
\[
\theta_2 = \gamma + \alpha, \qquad \theta_3 = \beta - \pi.
\]
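The two branches above can be verified numerically against the planar forward kinematics. The sketch below implements the $\alpha$, $\beta$, $\gamma$ equations directly; the link lengths and target point are illustrative:

```python
import numpy as np

def planar_two_link_ik(x, y, L1, L2):
    """Return the rightward and leftward joint solutions of a planar 2-link arm."""
    r2 = x * x + y * y
    c_beta = (L1**2 + L2**2 - r2) / (2 * L1 * L2)
    if abs(c_beta) > 1.0:
        raise ValueError("target out of reach")
    beta = np.arccos(c_beta)                                   # beta in [0, pi]
    alpha = np.arccos((L1**2 - L2**2 + r2) / (2 * L1 * np.sqrt(r2)))
    gamma = np.arctan2(y, x)
    rightward = (gamma - alpha, np.pi - beta)
    leftward = (gamma + alpha, beta - np.pi)
    return rightward, leftward

right, left = planar_two_link_ik(1.0, 1.0, L1=1.0, L2=1.0)
```

Both branches reach the same end point; in the dual-arm setting the branch is chosen so that the elbows of the two arms do not collide.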
Since links 1, 2, and 3 of the UR5 mechanical arm studied in this paper lie in one plane, after solving for $\theta_3$, $\theta_4$ can also be obtained by the cosine theorem:
\[
\theta_4 = \cos^{-1}\!\left(\frac{d_{34}^2 + d_{45}^2 - d_{35}^2}{2\,d_{34}\,d_{45}}\right).
\]
Since $\theta_1$, $\theta_2$, $\theta_3$, and $\theta_4$ have been found, the inverse matrices $({}^{0}T_{1})^{-1}$, $({}^{1}T_{2})^{-1}$, $({}^{2}T_{3})^{-1}$, and $({}^{3}T_{4})^{-1}$ are known. To find $\theta_5$, multiply both sides on the left by these inverses:
\[
({}^{3}T_{4})^{-1}({}^{2}T_{3})^{-1}({}^{1}T_{2})^{-1}({}^{0}T_{1})^{-1}\,{}^{0}T_{6} = {}^{4}T_{5}\,{}^{5}T_{6}.
\]
Through solving Equation (18), θ 5 is easily obtained as
\[
\theta_5 = \tan^{-1}\!\left(\frac{c_{234}\left(\cos\theta_1\,a_x + \sin\theta_1\,a_y\right) + s_{234}\,a_z}{\sin\theta_1\,a_x - \cos\theta_1\,a_y}\right),
\]
where $c_{234} = \cos\theta_4\cos(\theta_3+\theta_2) - \sin\theta_4\sin(\theta_3+\theta_2) = \cos(\theta_2+\theta_3+\theta_4)$ and $s_{234} = \cos\theta_4\sin(\theta_3+\theta_2) + \sin\theta_4\cos(\theta_3+\theta_2) = \sin(\theta_2+\theta_3+\theta_4)$.
Similarly, θ 6 can be obtained as
\[
\theta_6 = \tan^{-1}\!\left(\frac{-s_{234}\left(\cos\theta_1\,n_x + \sin\theta_1\,n_y\right) + c_{234}\,n_z}{-s_{234}\left(\cos\theta_1\,o_x + \sin\theta_1\,o_y\right) + c_{234}\,o_z}\right).
\]
Observation of the inverse kinematic equations shows that, given a mechanical arm end position, there can be eight different sets of joint solutions. Due to the existence of singular positions in UR5 and the existence of a common workspace when the arms work together, some of the solutions cannot satisfy the requirements, which will be discussed in detail in Section 4.2.
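The paper does not spell out the selection criterion among the eight branches; a common heuristic, sketched here under that assumption, is to discard branches that violate joint limits and then pick the branch closest to the current configuration, which minimizes joint travel and avoids jumps near singular positions:

```python
import numpy as np

# Nominal UR5 joint limits (+/-360 degrees for every joint; an assumption here).
JOINT_LIMITS = np.deg2rad(np.tile([-360.0, 360.0], (6, 1)))

def select_ik_branch(candidates, q_current, limits=JOINT_LIMITS):
    """Filter candidate joint vectors by the limits, then minimize joint travel."""
    feasible = [np.asarray(q, float) for q in candidates
                if np.all((np.asarray(q) >= limits[:, 0]) &
                          (np.asarray(q) <= limits[:, 1]))]
    if not feasible:
        raise ValueError("no feasible inverse-kinematics branch")
    # The branch requiring the least total joint motion from the current pose.
    return min(feasible, key=lambda q: float(np.sum(np.abs(q - q_current))))
```

In the dual-arm case, a further feasibility check against the shared workspace (Section 4.2) would be applied before this distance criterion.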

3.2.3. Cartesian Spatial Trajectory Planning

Provided that the arm's structure is stable and no singular configurations occur, the trajectory of the mechanical arm from the initial position to the target position reflects its law of motion, which is crucial for completing the task [40]. This paper adopts Cartesian-space trajectory planning [41]. Although the method is computationally complex and the control process is cumbersome, it offers higher motion accuracy, adaptability, and stability in mechanical arm operation tasks.
In Cartesian spatial linear planning, the positions and poses of the start and end points of a straight line are known, and it is necessary to compute the poses of each trajectory interpolation point. Usually, the robot moves along a straight line with unchanged pose and no pose interpolation is required [42]. If the attitude needs to be changed, attitude interpolation is performed. Knowing the position and attitude of the start point M and the end point N relative to the base coordinate system, as shown below, and assuming that the desired velocity along the straight line is v and the interpolation time interval is t, the length of the straight line L, the distance d in the time interval t, and the number of interpolations required N can be calculated.
\[
L = \sqrt{(x_2-x_1)^2 + (y_2-y_1)^2 + (z_2-z_1)^2},\qquad
d = v\,t,\qquad
N = \frac{L}{d} + 1.
\]
The increment of each axis of the neighboring interpolation point is obtained as follows:
\[
\Delta x = \frac{x_2-x_1}{N},\qquad
\Delta y = \frac{y_2-y_1}{N},\qquad
\Delta z = \frac{z_2-z_1}{N}.
\]
The coordinate values of the interpolation points for each axis are as follows:
\[
x_{i+1} = x_1 + i\,\Delta x,\qquad
y_{i+1} = y_1 + i\,\Delta y,\qquad
z_{i+1} = z_1 + i\,\Delta z.
\]
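The straight-line interpolation above can be sketched directly: compute the segment length, the per-period distance, the number of steps, and then the evenly spaced waypoints. The speed and sampling period below are illustrative values, not ones taken from the paper:

```python
import numpy as np

def line_waypoints(start, end, v=0.05, dt=0.01):
    """Cartesian straight-line waypoints: L = |end - start|, d = v*dt, N = L/d + 1."""
    start = np.asarray(start, dtype=float)
    end = np.asarray(end, dtype=float)
    L = np.linalg.norm(end - start)
    d = v * dt                        # distance travelled per interpolation period
    n = int(L / d) + 1                # number of interpolation intervals
    step = (end - start) / n          # per-axis increments (delta x, delta y, delta z)
    return np.array([start + i * step for i in range(n + 1)])

pts = line_waypoints([0.0, 0.0, 0.0], [0.10, 0.0, 0.05])
```

Each waypoint is then handed to the inverse-kinematics solver of Section 3.2.2 to obtain the joint-space command.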
Any three non-collinear points in three-dimensional space determine a spatial circle. In mechanical arm movement, besides the straight line, the circular arc is another basic trajectory, and by combining circles and straight lines, other complex trajectories can easily be obtained. Consider three non-collinear points $P_1$, $P_2$, $P_3$ in space, as shown in Figure 8b.
First, find the center of the circle P 0 ( x 0 , y 0 , z 0 ) and its radius. The general equation of the plane in space is known as
\[
A_0 x + B_0 y + C_0 z + D_0 = 0.
\]
An alternative form of the equation of this plane can be obtained from the coordinates of the three points.
\[
\begin{vmatrix}
x - x_3 & y - y_3 & z - z_3\\
x_1 - x_3 & y_1 - y_3 & z_1 - z_3\\
x_2 - x_3 & y_2 - y_3 & z_2 - z_3
\end{vmatrix} = 0.
\]
Correspondence with the general equation can be obtained as
\[
\begin{aligned}
A_0 &= (y_1-y_3)(z_2-z_3) - (y_2-y_3)(z_1-z_3),\\
B_0 &= (x_2-x_3)(z_1-z_3) - (x_1-x_3)(z_2-z_3),\\
C_0 &= (x_1-x_3)(y_2-y_3) - (x_2-x_3)(y_1-y_3),\\
D_0 &= -(A_0 x_3 + B_0 y_3 + C_0 z_3).
\end{aligned}
\]
The equation of the plane T passing through the midpoint of segment $P_1P_2$ and perpendicular to it is
\[
A_1 x + B_1 y + C_1 z + D_1 = 0,
\]
where
\[
A_1 = x_2 - x_1,\quad B_1 = y_2 - y_1,\quad C_1 = z_2 - z_1,\quad
D_1 = -\frac{(x_2^2 - x_1^2) + (y_2^2 - y_1^2) + (z_2^2 - z_1^2)}{2}.
\]
Correspondingly, the equation of the plane S passing through the midpoint of segment $P_2P_3$ and perpendicular to it is
\[
A_2 x + B_2 y + C_2 z + D_2 = 0,
\]
where
\[
A_2 = x_3 - x_2,\quad B_2 = y_3 - y_2,\quad C_2 = z_3 - z_2,\quad
D_2 = -\frac{(x_3^2 - x_2^2) + (y_3^2 - y_2^2) + (z_3^2 - z_2^2)}{2}.
\]
The center of the spatial circle is the intersection of the planes M, T, and S. Combining the three plane equations, the coordinates of the center $P_0 = [x_0\ y_0\ z_0]^{T}$ of the circle through $P_1$, $P_2$, $P_3$ satisfy
\[
\begin{bmatrix}
A_0 & B_0 & C_0\\
A_1 & B_1 & C_1\\
A_2 & B_2 & C_2
\end{bmatrix}
\begin{bmatrix} x_0\\ y_0\\ z_0 \end{bmatrix}
=
\begin{bmatrix} -D_0\\ -D_1\\ -D_2 \end{bmatrix}.
\]
It is possible to obtain the coordinates of the center of the circle
\[
\begin{bmatrix} x_0\\ y_0\\ z_0 \end{bmatrix}
=
\begin{bmatrix}
A_0 & B_0 & C_0\\
A_1 & B_1 & C_1\\
A_2 & B_2 & C_2
\end{bmatrix}^{-1}
\begin{bmatrix} -D_0\\ -D_1\\ -D_2 \end{bmatrix},
\]
and radius
\[
r = \sqrt{(x_1-x_0)^2 + (y_1-y_0)^2 + (z_1-z_0)^2}.
\]
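The center-and-radius computation above amounts to solving a 3x3 linear system built from the plane of the three points and the two perpendicular-bisector planes. A compact sketch:

```python
import numpy as np

def circle_through_points(p1, p2, p3):
    """Center and radius of the circle through three non-collinear 3D points."""
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p1, p2, p3))
    # Row 1: the plane containing the three points (normal via a cross product).
    n = np.cross(p1 - p3, p2 - p3)
    if np.linalg.norm(n) < 1e-12:
        raise ValueError("points are collinear")
    # Rows 2-3: perpendicular-bisector planes of segments P1P2 and P2P3.
    A = np.vstack([n, p2 - p1, p3 - p2])
    b = np.array([
        n @ p3,
        (p2 @ p2 - p1 @ p1) / 2.0,
        (p3 @ p3 - p2 @ p2) / 2.0,
    ])
    center = np.linalg.solve(A, b)
    return center, np.linalg.norm(p1 - center)
```

Sampling the resulting circle between the three points yields the arc waypoints that, combined with line segments, compose the tapping trajectories of Section 4.2.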

4. Case Study

4.1. Visual Calibration

Two Intel RealSense D435i cameras were first calibrated. The calibration process involves acquiring calibration images and calculating the internal and external parameters of the cameras. In this paper, a 27 × 27 checkerboard grid is used as the calibration board, as shown in Figure 9. Camera calibration proceeds in four steps [43].
  • The images acquired by the camera are binarized to identify the black board regions (i.e., quadrangles) at the locations of the candidate board’s corner points.
  • In a subsequent filtering step, only those quadrilaterals that meet a specific size requirement are retained and arranged in a regular grid structure that has user-specified dimensions.
  • After the initial inspection of the calibration plate has been completed, the location of the corner points can be determined with a very high degree of accuracy. This is because corner points (mathematically known as saddle points) are geometrically close to infinitely small points and therefore remain unbiased despite perspective shifts or lens distortion.
  • After successful detection of the checkerboard grid, further sub-pixel level refinement can be performed to pinpoint the location of the saddle point. This process makes use of the exact gray values of the pixels surrounding the corner point, thus achieving sub-pixel level accuracy that is higher than whole-pixel accuracy.
Figure 10 shows the main steps of visual localization. During the calibration process, standard camera calibration methods were used to correct the radial and tangential distortion introduced by the lenses. This ensures the accuracy of the visual input. Here, 12 sets of calibration board images were taken, as shown in Figure 11. The calibration results shown in Figure 12 were obtained by 3D reconstruction.

4.2. Dual-Arm Cooperative Trajectory Control of Ceramic Work

A dual-arm collaborative system takes into account spatial constraints and singular configurations. When planning a dual-arm task, an optimization algorithm is used to reasonably allocate the work content of each robotic arm to ensure that there is no physical conflict between the two arms and to minimize the possibility of entering a singular region.
The D-H model of the UR5 is imported into MATLAB. In Figure 13, the intertwined dark blue and beige region represents the workspace shared by the two mechanical arms. Within this shared workspace, the validity of the joint angles obtained from the inverse kinematics solution must be evaluated individually for each robot; that is, for each mechanical arm, it must be ensured that the joint angles obtained by inverse solving can actually be applied in practice to achieve the intended movements and tasks. This individual consideration ensures accuracy and efficiency when the arms collaborate, and avoids operational errors or physical constraint conflicts caused by unreasonable joint angles. To design the motion trajectory of the tapping action for the pottery robot, a series of key points is defined, representing the positions the mechanical arm must reach to complete the specific task (i.e., tapping the body of the pot). For best results, these trajectories should be as smooth and continuous as possible, avoiding sudden changes or pauses, which ensures the smoothness and uniformity of the tapping action.
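As a sanity check on the kinematic model, the standard D-H chain of Table 1 can be evaluated numerically. The sketch below is a hedged numpy illustration (not the authors' MATLAB code), using the standard D-H link transform A_i = Rot_z(θ_i) Trans_z(d_i) Trans_x(a_i) Rot_x(α_i).

```python
import numpy as np

# Standard D-H parameters of the UR5 from Table 1: (alpha/rad, a/mm, d/mm).
DH = [
    ( np.pi / 2,    0.0,   89.159),
    ( 0.0,       -425.0,    0.0  ),
    ( 0.0,       -392.25,   0.0  ),
    ( np.pi / 2,    0.0,  109.15 ),
    (-np.pi / 2,    0.0,   94.65 ),
    ( 0.0,          0.0,   82.3  ),
]

def link_transform(theta, alpha, a, d):
    # Homogeneous transform A_i = Rz(theta) Tz(d) Tx(a) Rx(alpha)
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([[ct, -st * ca,  st * sa, a * ct],
                     [st,  ct * ca, -ct * sa, a * st],
                     [ 0,       sa,       ca,      d],
                     [ 0,        0,        0,      1]])

def fk(thetas):
    # Chain the six link transforms to get the end-effector pose.
    T = np.eye(4)
    for th, (alpha, a, d) in zip(thetas, DH):
        T = T @ link_transform(th, alpha, a, d)
    return T

T = fk(np.zeros(6))
print(T[:3, 3])  # end-effector position (mm) at the zero configuration
```

Evaluating the chain at the zero configuration gives the arm's fully extended pose along the base x-axis, a quick consistency check before running the inverse solution.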
Remark 2.
The joint space splicing may encounter constraint problems when the mechanical arm performs the flapping motion, such as the need to reduce speed during sharp turns. Smoothness during the motion is therefore considered in trajectory planning, and a joint space adaptive transition method is used here. The end of the mechanical arm starts from point A and runs normally in the red working section; when a sharp turn occurs, appropriate control signals are added at the control point so that the end moves slowly along the blue transition section and finally reaches point B, as shown in Figure 14.
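The transition idea in Remark 2 can be sketched for a single joint: two linear segments meeting at a sharp corner at time T are joined by a parabolic blend of half-width tb, so the joint slows smoothly through the turn instead of changing velocity instantaneously. This is a generic linear-segment-with-parabolic-blend sketch, not the paper's controller; all numeric values are illustrative.

```python
import numpy as np

def blended(t, q0, qc, q1, T, tb):
    # Segment A: q0 -> qc over [0, T]; segment B: qc -> q1 over [T, 2T].
    v1 = (qc - q0) / T          # velocity on the first working section
    v2 = (q1 - qc) / T          # velocity on the second working section
    a = (v2 - v1) / (2 * tb)    # constant acceleration inside the blend
    if t < T - tb:              # working section before the turn
        return q0 + v1 * t
    if t > T + tb:              # working section after the turn
        return qc + v2 * (t - T)
    dt = t - (T - tb)           # inside the transition section
    return q0 + v1 * (T - tb) + v1 * dt + 0.5 * a * dt ** 2

ts = np.linspace(0.0, 2.0, 2001)
qs = np.array([blended(t, 0.0, 1.0, 0.2, 1.0, 0.2) for t in ts])
vs = np.gradient(qs, ts)  # numerical velocity: continuous across the corner
```

Without the blend the velocity would jump from v1 to v2 at t = T; with it, the acceleration stays bounded and the velocity profile is continuous.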
Specifically, the left arm of the pottery robot holds the inside of the pot in place while the right arm, equipped with a wooden pottery patting tool, shapes the pot as it rotates on the turntable. Following a pre-programmed trajectory, the right arm mimics a human potter's technique, applying meticulous pressure to form the desired basic shape, as shown in Figure 15. Meanwhile, the left arm must not only keep the clay rotating stably but also adjust its speed and force according to the right arm's operation, ensuring that the clay is evenly stressed throughout the process and avoiding deformation or cracking caused by excessive local stress.

4.3. Results Analysis

Analysis of the kinematic model shows that up to eight different sets of joint angle combinations can theoretically be computed for a given end pose (i.e., position and attitude) of the mechanical arm. This is because the inverse kinematics solution of the mechanical arm is generally not unique; multiple different joint angle configurations may place the end of the mechanical arm in exactly the same position and attitude.
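This multiplicity is easiest to see on the planar 2DoF open chain of Figure 7, which already admits two configurations (elbow-up and elbow-down) for any reachable target; a 6DoF arm such as the UR5 extends this to up to eight. Below is a minimal sketch with arbitrarily chosen link lengths and target, for illustration only.

```python
import numpy as np

def ik_2dof(x, y, l1, l2):
    # Closed-form planar 2-link inverse kinematics via the law of cosines.
    c2 = (x ** 2 + y ** 2 - l1 ** 2 - l2 ** 2) / (2 * l1 * l2)
    s2 = np.sqrt(1 - c2 ** 2)
    sols = []
    for s in (s2, -s2):                      # elbow-down / elbow-up branches
        t2 = np.arctan2(s, c2)
        t1 = np.arctan2(y, x) - np.arctan2(l2 * s, l1 + l2 * c2)
        sols.append((t1, t2))
    return sols

def fk_2dof(t1, t2, l1, l2):
    # Forward kinematics, used to verify each inverse solution.
    return (l1 * np.cos(t1) + l2 * np.cos(t1 + t2),
            l1 * np.sin(t1) + l2 * np.sin(t1 + t2))

solutions = ik_2dof(0.7, 0.5, l1=0.6, l2=0.4)
```

Both joint-angle pairs map back through the forward kinematics to the same target point, which is exactly the non-uniqueness described above.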
The simulation is carried out in a Cartesian coordinate system, where the spatial position of the purple clay teapot to be made is known, and the inverse kinematics solution is used to find the joint angles required to reach the target position. The right arm of the pottery robot moves along a preset trajectory, while the left arm remains fixed.
Figure 16 and Figure 17 show the motion characteristics of the right arm of the ceramic robot while tapping the clay blank. The joint velocity gradually decreases from 0 to −12 cm/s, indicating that the arm continuously decelerates as it approaches the clay billet, ensuring precise, soft contact and avoiding deformation or damage caused by excessive speed. The velocity then jumps to 50 and 100 cm/s, corresponding to the rapid retraction after each tap, which raises operating efficiency by accelerating away from the billet and enabling high-frequency reciprocating motion. The large acceleration magnitudes of ±40 cm/s² reflect rapid braking at the moment of contact and sharp acceleration at disengagement, dynamically adjusted to balance motion precision against efficiency; the values of 50 and 100 near the end may correspond to a reset stage or to multi-axis coordinated-control parameter settings. The trajectory value drops from 0.5 to −2 before fluctuating back up; combined with the X and Y coordinate markers, this shows that the arm moves along a complex curve in three-dimensional space, for example a negative Y-axis offset when pressing down and a recovery when lifting, applying force evenly along a non-linear path to avoid local over-tapping. Overall, the arm combines segmented speed control (decelerating on approach, accelerating on retreat) with dynamic acceleration adjustment and spatial curve planning. This reproduces the flexibility and slight randomness of manual patting, guaranteeing a uniform texture in the formed ceramic billet while improving the efficiency of automated operation.

5. Conclusions and Future Work

Focusing on the theoretical gaps and technical shortcomings in the art robotics field, this paper introduces the concept of pottery robotics, realizes the use of a mechanical arm for pottery creation, and organically combines robotics with pottery art, opening up a new path for pottery creation. The research involved several tasks. First, a theoretical model was constructed, including kinematic modeling based on the D-H parametric method and the building of a 3D solid model. Second, a methodology system was proposed, covering visual localization, trajectory planning based on an inverse kinematics algorithm in which the Jacobian matrix of the 6DoF dual-arm structure is computed by the vector differential method, and dynamic coordinated control strategies for the dual arms. Finally, simulations were conducted on the MATLAB platform to verify the procedural generation of traditional slapping motions, and system accuracy was evaluated from the experimental data, comprehensively validating the feasibility and effectiveness of the proposed method.
This paper plays an important role at both the theoretical and practical levels. Theoretically, it enriches the theoretical framework in the field of robotic art creation, provides new theoretical perspectives and research methods for integrating art, science, and technology, and promotes interdisciplinary development. Practically, it verifies the feasibility of robots paired with machine vision for precision operations, offers new ideas and technical support for the automation of future art creation, helps promote the deep integration of art and science and technology, improves the precision and efficiency of ceramic art production, and demonstrates the potential application of robotics in art creation.
Despite achieving certain results in the field of pottery robotics, there are some limitations. For instance, the stability and flexibility of the dynamic coordinated control strategy for both arms need enhancement when handling complex pottery movements; there is a gap between simulation on the MATLAB platform and actual applications, and hardware compatibility and real-time issues in practical pottery creation need to be addressed.
To address the above issues, future work could focus on the following areas:
  • Deploy the simulation on actual hardware so that, combined with visual localization, the pottery robot can autonomously complete the production of purple clay teapots.
  • Optimize the machine vision algorithm and introduce advanced techniques such as deep learning to improve the recognition accuracy of the clay embryo and the tapping effect.
  • Refine the dynamic coordinated control strategy of both arms and adopt adaptive motion planning to enhance the flexibility and response speed of the system in dynamic environments.
  • Integrate a torque sensor into the end-effector of the robotic arm to monitor in real time the force distribution during contact with the clay, allowing the movements of both arms to be adjusted dynamically.
These efforts are expected to promote the further development of ceramic robotics, bring more possibilities to artistic creation, and fill the gaps in robotics at the artistic level.

Author Contributions

Conceptualization, W.Z., X.F. and Y.C.; methodology, W.Z. and X.F.; software, X.F.; validation, Y.C. and G.W.; formal analysis, T.W. and G.W.; investigation, L.C. and T.W.; resources, G.W., L.C. and T.W.; data curation, Wang; writing—original draft preparation, W.Z. and L.C.; writing—review and editing, T.W. and Y.C.; visualization, X.F. and G.W.; supervision, T.W. and Y.C.; project administration, G.W.; funding acquisition, W.Z. and Y.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Natural Science Foundation of China (grant number 62103293), the Natural Science Foundation of Jiangsu Province (grant number BK20210709), the Youth Project of Art Studies funded by the National Social Science Fund of China (grant number 23CF195), the General Projects of Philosophy and Social Sciences Research in Jiangsu Colleges and Universities (grant number 2020SJA1352), and the General Projects funded by the Jiangsu Provincial Social Science Fund (grant number 22YSD005).

Data Availability Statement

The original contributions presented in this study are included in the article material. Further inquiries can be directed to the corresponding authors.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
ILDA: Integrated Linkage-driven Dexterous Anthropomorphic
DoF: Degree of Freedom
SURF: Speeded Up Robust Feature
MPC: Model Predictive Control
D-H: Denavit-Hartenberg

References

  1. Torres, R.; Ferreira, N. Robotic manipulation in the ceramic industry. Electronics 2022, 11, 4180.
  2. Tao, H.; Wang, P.; Chen, Y.; Stojanovic, V.; Yang, H. An unsupervised fault diagnosis method for rolling bearing using STFT and generative neural networks. J. Frankl. Inst. 2020, 357, 7286–7307.
  3. Sheng, H.; Wei, S.; Yu, X.; Tang, L. Research on Binocular Visual System of Robotic Arm Based on Improved SURF Algorithm. IEEE Sens. J. 2020, 20, 11849–11855.
  4. Nicolis, D.; Palumbo, M.; Zanchettin, A.M.; Rocco, P. Occlusion-Free Visual Servoing for the Shared Autonomy Teleoperation of Dual-Arm Robots. IEEE Robot. Autom. Lett. 2018, 3, 796–803.
  5. Fan, J.; Zheng, P.; Li, S. Vision-based holistic scene understanding towards proactive human–robot collaboration. Robot. Comput.-Integr. Manuf. 2022, 75, 102304.
  6. Yu, X.; He, W.; Li, Q.; Li, Y.; Li, B. Human-Robot Co-Carrying Using Visual and Force Sensing. IEEE Trans. Ind. Electron. 2021, 68, 8657–8666.
  7. Wang, G.; Li, Z.; Weng, G.; Chen, Y. An optimized denoised bias correction model with local pre-fitting function for weak boundary image segmentation. Signal Process. 2024, 220, 109448.
  8. Huang, P.; Gu, Y.; Li, H.; Yazdi, M.; Qiu, G. An Optimal Tolerance Design Approach of Robot Manipulators for Positioning Accuracy Reliability. Reliab. Eng. Syst. Saf. 2023, 237, 109347.
  9. Yang, C.; Lu, W.; Xia, Y. Positioning Accuracy Analysis of Industrial Robots Based on Non-Probabilistic Time-Dependent Reliability. IEEE Trans. Reliab. 2024, 73, 608–621.
  10. Cheng, C.; Zhang, H.; Sun, Y.; Tao, H.; Chen, Y. A cross-platform deep reinforcement learning model for autonomous navigation without global information in different scenes. Control Eng. Pract. 2024, 150, 105991.
  11. Qin, H.; Shao, S.; Wang, T.; Yu, X.; Jiang, Y.; Cao, Z. Review of Autonomous Path Planning Algorithms for Mobile Robots. Drones 2023, 7, 211.
  12. Chu, Z.; Wang, F.; Lei, T.; Luo, C. Path Planning Based on Deep Reinforcement Learning for Autonomous Underwater Vehicles Under Ocean Current Disturbance. IEEE Trans. Intell. Veh. 2023, 8, 108–120.
  13. Messaoudi, K.; Oubbati, O.S.; Rachedi, A.; Lakas, A.; Bendouma, T.; Chaib, N. A survey of UAV-based data collection: Challenges, solutions and future perspectives. J. Netw. Comput. Appl. 2023, 216, 103670.
  14. Cui, S.; Chen, Y.; Li, X. A Robust and Efficient UAV Path Planning Approach for Tracking Agile Targets in Complex Environments. Machines 2022, 10, 931.
  15. Zendehdel, N.; Chen, H.; Song, Y.S.; Leu, M.C. Implementing Eye Movement Tracking for UAV Navigation. In Proceedings of the International Symposium on Flexible Automation, Seattle, DC, USA, 21–24 July 2024; Volume 87882, p. V001T08A004.
  16. Yang, G.; Li, M.; Gao, Q. Multi-Automated Guided Vehicles Conflict-Free Path Planning for Packaging Workshop Based on Grid Time Windows. Appl. Sci. 2024, 14, 3341.
  17. de Cássio Zequi, S.; Ren, H. Robotic motion planning in surgery. In Handbook of Robotic Surgery; Academic Press: Cambridge, MA, USA, 2025; pp. 169–178.
  18. Kim, U.; Jung, D.; Jeong, H.; Park, J.; Jung, H.M.; Cheong, J.; Choi, H.R.; Do, H.; Park, C. Integrated Linkage-Driven Dexterous Anthropomorphic Robotic Hand. Nat. Commun. 2021, 12, 7177.
  19. Ma, P.; He, X.; Chen, Y.; Liu, Y. ISOD: Improved small object detection based on extended scale feature pyramid network. Vis. Comput. 2025, 41, 465–479.
  20. Abondance, S.; Teeple, C.B.; Wood, R.J. A Dexterous Soft Robotic Hand for Delicate In-Hand Manipulation. IEEE Robot. Autom. Lett. 2020, 5, 5502–5509.
  21. Lee, D.H.; Choi, M.S.; Park, H.; Jang, G.R.; Park, J.H.; Bae, J.H. Peg-in-Hole Assembly with Dual-Arm Robot and Dexterous Robot Hands. IEEE Robot. Autom. Lett. 2022, 7, 8566–8573.
  22. Chen, Y.; Chu, B.; Freeman, C.T. Point-to-Point Iterative Learning Control with Optimal Tracking Time Allocation. IEEE Trans. Control Syst. Technol. 2018, 26, 1685–1698.
  23. Bebek, O.; Canduran, K. Rebirth of Ceramic Art in The Digital Age: Transformation Journey from 3D Modeling to NFT. Art Vis. 2024, 30, 135–140.
  24. Liu, M.; Gu, Q.; Yang, B.; Yin, Z.; Liu, S.; Yin, L.; Zheng, W. Kinematics Model Optimization Algorithm for Six Degrees of Freedom Parallel Platform. Appl. Sci. 2023, 13, 3082.
  25. Jiang, W.; Wu, D.; Dong, W.; Ding, J.; Ye, Z.; Zeng, P.; Gao, Y. Design and Validation of a Nonparasitic 2R1T Parallel Hand-Held Prostate Biopsy Robot with Remote Center of Motion. J. Mech. Robot. 2024, 16, 051009.
  26. Cao, Y.; Chen, X.; Zhang, M.; Huang, J. Adaptive Position Constrained Assist-as-Needed Control for Rehabilitation Robots. IEEE Trans. Ind. Electron. 2024, 71, 4059–4068.
  27. Li, C.; Zheng, P.; Li, S.; Pang, Y.; Lee, C.K.M. AR-assisted digital twin-enabled robot collaborative manufacturing system with human-in-the-loop. Robot. Comput.-Integr. Manuf. 2022, 76, 102321.
  28. Corke, P.I. A Simple and Systematic Approach to Assigning Denavit–Hartenberg Parameters. IEEE Trans. Robot. 2007, 23, 590–594.
  29. Rocha, C.R.; Tonetto, C.P.; Dias, A. A Comparison between the Denavit-Hartenberg and the Screw-based Methods Used in Kinematic Modeling of Robot Manipulators. Robot. Comput.-Integr. Manuf. 2011, 27, 723–728.
  30. Lengiewicz, J.; Kursa, M.; Holobut, P. Modular-robotic Structures for Scalable Collective Actuation. Robotica 2017, 35, 787–808.
  31. Li, C.H.G.; Chang, Y.M. Automated Visual Positioning and Precision Placement of a Workpiece Using Deep Learning. Int. J. Adv. Manuf. Technol. 2019, 104, 4527–4538.
  32. Zhou, Q.; Li, X. Visual Positioning of Distant Wall-Climbing Robots Using Convolutional Neural Networks. J. Intell. Robot. Syst. 2020, 98, 603–613.
  33. Moriwaki, K. Adaptive Exposure Image Input System for Obtaining High-Quality Color Information. Syst. Comput. Jpn. 1994, 25, 51–60.
  34. Zhou, S.; Wu, J.; Lu, Q. Spatial Gating with Hybrid Receptive Field for Robot Visual Localization. Int. J. Comput. Intell. Syst. 2024, 17, 131.
  35. Singh, A.; Raj, K.; Kumar, T.; Verma, S.; Roy, A.M. Deep Learning-Based Cost-Effective and Responsive Robot for Autism Treatment. Drones 2023, 7, 81.
  36. Wang, G.; Li, Z.; Weng, G.; Chen, Y. An overview of industrial image segmentation using deep learning models. Intell. Robot. 2025, 5, 143–180.
  37. Chen, Y.; Cui, Z.; Wang, X.; Wang, D.; Wang, Z.; Song, Z.; Liu, D. Combining Closed-form and Numerical Solutions for the Inverse Kinematics of Six-degrees-of-freedom Collaborative Handling Robot. Int. J. Adv. Robot. Syst. 2024, 21, 17298806241228372.
  38. Xie, S.; Sun, L.; Wang, Z.; Chen, G. A Speedup Method for Solving the Inverse Kinematics Problem of Robotic Manipulators. Int. J. Adv. Robot. Syst. 2022, 19, 17298806221104602.
  39. Chen, Y.; Chu, B.; Freeman, C.T. A coordinate descent approach to optimal tracking time allocation in point-to-point ILC. Mechatronics 2019, 59, 25–34.
  40. Gu, Y.; Wu, J.; Liu, C. Error Analysis and Accuracy Evaluation Method for Coordinate Measurement in Transformed Coordinate System. Measurement 2025, 242, 115860.
  41. Gao, H.; An, H.; Lin, W.; Yu, X.; Qiu, J. Trajectory Tracking of Variable Centroid Objects Based on Fusion of Vision and Force Perception. IEEE Trans. Cybern. 2023, 53, 7957–7965.
  42. Liu, Z.; Li, Z.; Ran, J. Robust Solution for Coordinate Transformation Based on Coordinate Component Weighting. J. Surv. Eng. 2023, 149, 04023006.
  43. Chen, Y.; Wu, L.; Wang, G.; He, H.; Weng, G.; Chen, H. An active contour model for image segmentation using morphology and nonlinear Poisson’s equation. Optik 2023, 287, 170997.
Figure 1. Gesture of clapping action in making purple clay teapot.
Figure 2. Schematic of the D-H model of the mechanical arm (UR5).
Figure 3. Designed mechanics of the pottery robot body.
Figure 4. The Robotiq electric gripper is used as the end-effector of the robot’s arms and can be used with other tools to achieve ceramic works.
Figure 5. Overall structure (3D modeling) of the pottery robot.
Figure 6. 3D camera calibration of both cameras shares common 3D physical characteristics.
Figure 7. Inverse kinematics of a planar 2DoF open chain.
Figure 8. Coordinate systems for Cartesian trajectory planning.
Figure 9. A 27 × 27 checkerboard grid.
Figure 10. Flowchart of visual localization based on two monocular cameras.
Figure 11. Parallax elimination analysis of a 27 × 27 checkerboard grid captured by Realsense D435i.
Figure 12. Camera calibration test results.
Figure 13. The workspace of the pottery robot with two collaborative arms.
Figure 14. Joint space adaptive transition.
Figure 15. Path planning simulation of tapping action of pottery robot.
Figure 16. The velocity (cm/s) and acceleration (cm/s²) of the right arm during clapping movements.
Figure 17. The trajectory of the end-effector of the right arm.
Table 1. Standard D-H model parameters of UR5.

Kinematics | Link Twist α/rad | Link Length a/mm | Link Offset d/mm | Joint Angle θ
Joint 1 | π/2 | 0 | 89.159 | θ1
Joint 2 | 0 | −425 | 0 | θ2
Joint 3 | 0 | −392.25 | 0 | θ3
Joint 4 | π/2 | 0 | 109.15 | θ4
Joint 5 | −π/2 | 0 | 94.65 | θ5
Joint 6 | 0 | 0 | 82.3 | θ6
Table 2. UR5 mechanical arm parameters.

Content | Parameter
Arm Length | 850 mm
Weight | 18.4 kg
Maximum Payload | 5 kg
Repeatability of Joint Position | 0.03 mm
Joint Speed | 180°/s
Rotation Range | ±360°
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Zhang, W.; Feng, X.; Cao, L.; Wang, T.; Wang, G.; Chen, Y. Visual Localization and Path Planning for a Dual-Arm Collaborative Pottery Robot. Symmetry 2025, 17, 532. https://doi.org/10.3390/sym17040532