Study on Comprehensive Calibration and Image Sieving for Coal-Gangue Separation Parallel Robot

1 School of Mechanical Electronic and Information Engineering, China University of Mining and Technology (Beijing), Beijing 100083, China
2 Institute of Smart Mines and Robotics, China University of Mining and Technology (Beijing), Beijing 100083, China
3 Department of Mechanical Engineering, Tsinghua University, Beijing 100084, China
* Authors to whom correspondence should be addressed.
Appl. Sci. 2020, 10(20), 7059; https://doi.org/10.3390/app10207059
Submission received: 2 September 2020 / Revised: 3 October 2020 / Accepted: 6 October 2020 / Published: 11 October 2020
(This article belongs to the Special Issue Application of Computer Science in Mobile Robots)

Abstract

Online sorting robots based on image recognition are key pieces of equipment for the intelligent washing of coal mines. In this paper, a Delta-type coal gangue sorting parallel robot is designed to automatically identify and sort scattered coal and gangue on conveyor belts by configuring an image recognition system. Robot calibration reduces the influence of installation error on system accuracy and provides the basis for the robot to accurately track and grab gangue. Because the traditional conveyor belt calibration method does not consider the angle deflection error between the conveyor belt coordinate system and the robot coordinate system, an improved comprehensive calibration method is put forward in this paper. Firstly, the working principle and the image recognition and positioning process of the Delta coal gangue sorting robot are introduced. The scale factor parameter $Factor_c$ of the conveyor encoder is adopted to characterize the relationship between the moving distance of the conveyor and the encoder reading. The conveyor belt calibration experiment is described in detail. The transformation matrices between the camera, the conveyor belt, and the robot are obtained after establishing the three respective coordinate systems. The experimental results show that the maximum cumulative deviation of the traditional calibration method is 13.841 mm, whereas that of the comprehensive calibration method is 3.839 mm. The main innovation of the comprehensive calibration is that the accurate position of each coordinate in the robot coordinate system can be determined. The method is simple and feasible, effectively improves system calibration accuracy, and reduces the influence of robot installation error on grasping accuracy. Moreover, a calculation method to eliminate duplicate images is put forward, with the frame rate of the vision system set at seven frames per second to avoid repeated image acquisition and missed images. The experimental results show that this calculation method effectively improves the processing efficiency of the recognition system, thereby meeting the grasping precision demands of coal gangue separation engineering. The goal of “safety with few people and safety with none” can therefore be achieved in coal gangue sorting using robots.

1. Introduction

Coal mine intellectualization is the core technical support for achieving high-quality development of the coal industry [1,2,3], and the intelligent coal washing system is one of the top ten intelligent systems constructed in intelligent coal mines [4,5]. The Made in China 2025—Energy Equipment Implementation Plan (Fa Gai Energy [2016] No. 1274) explicitly requires the development of intelligent washing equipment, focusing on 10-million-tons/year modular intelligent washing equipment and an intelligent control system. Based on information sensing, artificial intelligence, video monitoring, robots, and other technologies, automatic and less-manned control of the coal washing process is the developmental direction of intelligent washing systems. In this paper, a comprehensive calibration method is proposed to avoid the influence of robot installation error on grasping accuracy, and a duplicate-image elimination calculation method is proposed to solve the problems of duplicate collection and missed shooting of gangue images. Coal gangue washing can be divided into manual and mechanical separation. With the rapid development of modern robot technology, using robots with vision systems to complete highly repetitive sorting work instead of manual labor is becoming an increasingly prevalent trend [6,7]. The traditional automatic coal gangue separation method based on image or ray identification mainly adopts high-pressure air gun jet impact, which has low efficiency and high noise. References [8,9,10] studied coal gangue automatic sorting systems and sorting robots with different structural forms, and references [11,12,13,14,15] studied coal gangue image recognition technology. These research results promoted the construction of intelligent coal mine washing and proved that coal gangue sorting robots can automatically identify and sort coal gangue, thereby greatly improving the quality and efficiency of coal gangue sorting and benefiting both the economy and society. Delta parallel robots are widely used in packaging and sorting, assembly and painting, transportation, and palletizing due to their high speed, strong bearing capacity, and small cumulative error. Parallel robots of this structure can be used in the field of online coal gangue identification and sorting through the configuration of visual recognition systems.
The integrated calibration between the vision system, the robot, and the conveyor belt is the basis for robots achieving high-precision grasping control. Due to uncontrollable factors such as machining error and installation error, the ideal structural parameters, such as hinge position and rod length, deviate from the actual structural parameters, making the kinematic model of the parallel robot inaccurate and affecting its accuracy [16,17]. References [18,19] studied the accuracy calibration of Delta parallel robots, whereas reference [20] presented a two-stage calibration method for the parallel robot Delta 4. Reference [21] focused on the problem of initial pose estimation by means of proprioceptive sensors (self-calibration) of suspended, underactuated, cable-driven parallel robots. References [22,23,24,25] studied robot kinematic calibration technology, and references [26,27,28,29,30] studied robot vision calibration technologies. The above methods are suitable for high-precision positioning occasions, but they do not readily yield accurate kinematic models, and their calibration procedures are complex. On this basis, the traditional conveyor belt and vision calibration method of the coal gangue sorting parallel robot can be improved; a comprehensive calibration method is therefore proposed to avoid the influence of robot installation error on grasping accuracy. The main innovations of this comprehensive calibration are as follows. Using this conveyor belt calibration method, the accurate position of each coordinate in the robot coordinate system can be determined, so that the end of the robot arm can accurately grasp objects on the conveyor belt. Because the conveyor belt coordinate system $O_C X_C Y_C Z_C$ is established during the movement of the conveyor belt, the $X_C$ axis of the established coordinate system is parallel to the movement direction of the conveyor belt. This calibration method therefore absorbs the angular deviation introduced during installation of the conveyor belt and the robot; that is, it does not require the moving direction of the conveyor belt to be exactly parallel to the X-axis of the robot coordinate system, nor the plane of the conveyor belt to be exactly perpendicular to its Z-axis, and it accounts for the slight angular deviations between them.
Finally, the calibration experiments on the conveyor belt and the vision system are completed, and the feasibility of the calibration method is verified. Furthermore, regarding the problem of repeated image acquisition in the process of coal gangue image recognition, a calculation method to eliminate repeated images is proposed, effectively solving the problems of repeated collection and missed shooting of gangue images and improving the processing efficiency of the recognition system.

2. Overall Scheme of the Coal Gangue Sorting Robot

In this paper, a Delta-type coal gangue sorting parallel robot is designed to automatically identify and sort the scattered coal and gangue on the conveyor belt by configuring an image recognition system. Coal and gangue recognition on the conveyor belt is a two-dimensional plane object recognition problem, so an eye-to-hand industrial camera is used, and the position of automatically identified gangue is tracked and fed back through the conveyor belt encoder. The overall structure of the coal gangue online sorting parallel robot is shown in Figure 1. The industrial camera is installed at the input end of the coal gangue mixed conveyor belt; it collects coal gangue images outside the Delta parallel robot workspace and transmits the image information to the gangue image recognition software for recognition and analysis. If the recognition result is gangue, the gangue position information is transmitted via the Transmission Control Protocol (TCP) to the information database of the Kemotion controller. The robot control system tracks the target gangue by combining the transmitted gangue position information with the encoder feedback and grabs the gangue using the pneumatic gripper. When the gangue is captured successfully, the Kemotion controller automatically deletes the corresponding gangue position data from the information database and the system enters the next working cycle.
The robot completes the recognition and positioning of coal gangue through the control system and transmits the gangue position information obtained by visual processing to the image position information database via the TCP protocol; combined with the conveyor belt calibration, this information is converted into a coordinate position in the robot coordinate system. The robot control system then processes the data, carries out trajectory planning, and completes the tracking and grasping actions. Figure 2 shows the design process of the coal gangue image recognition and positioning software.
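To make the data flow concrete, the following is a minimal sketch of the position hand-off described above, assuming a plain TCP socket and a simple fixed binary message (x, y, z in millimetres plus the encoder count at capture time). The address, port, and payload layout are assumptions for illustration, not the documented Kemotion interface.

```python
# Minimal sketch of the vision-to-controller hand-off (illustrative only):
# the vision computer packs one gangue record -- x, y, z in mm plus the encoder
# count at capture time -- and ships it to the controller over TCP.
import socket
import struct

CONTROLLER_ADDR = ("192.168.0.10", 2000)  # hypothetical controller IP and port

def send_gangue_position(x: float, y: float, z: float, encoder: float) -> None:
    """Send one gangue position record to the robot controller."""
    payload = struct.pack("<4d", x, y, z, encoder)  # little-endian, four doubles
    with socket.create_connection(CONTROLLER_ADDR, timeout=1.0) as sock:
        sock.sendall(payload)

# Example (requires a listening controller):
# send_gangue_position(120.5, -34.2, -890.3, 15230.0)
```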

3. Comprehensive Calibration of the Coal Gangue Sorting System

After the gangue position coordinates are determined, the sorting robot needs to carry out comprehensive calibration of the system in order to accurately complete the grasping work and convert the coordinate position information recognized by the vision system into position information in the robot coordinate system. The robot vision sorting system also requires system calibration, including robot body calibration, camera calibration, and hand–eye calibration. The integrated calibration between the vision system, the robot, and the conveyor belt is the basis for the robot to achieve high-precision grasping control. Because the ideal structural parameters of Delta robots usually deviate somewhat from the actual structural parameters, the moving direction of the conveyor belt cannot be guaranteed to be parallel to the X-axis of the robot coordinate system, nor can the plane of the conveyor belt be guaranteed to be perpendicular to its Z-axis; the resulting small angular deviations affect the accuracy of the sorting robot. Directly improving the machining and installation accuracy of parts greatly increases manufacturing cost. Therefore, this paper improves the traditional conveyor belt and visual calibration method, thereby reducing the impact of robot installation error on grasping accuracy.

3.1. Calibration between the Conveyor Belt System and the Robot System

The coordinate system relationship between the systems is shown in Figure 3. The robot coordinate system is $O_R X_R Y_R Z_R$, and the camera coordinate system established for the industrial camera is $O_V X_V Y_V Z_V$. In the field of view of the camera, the conveyor belt plane is used as the XY plane to establish the initial conveyor belt coordinate system $O_C X_C Y_C Z_C$. The $X_C$ direction is consistent with the movement direction of the conveyor belt.
  • Transformation matrix and scale factor
The calibration of the conveyor belt determines the pose of the conveyor belt relative to the robot coordinate system. The matrix $H_C^R$ represents the transformation between the two coordinate systems. If the initial position of a point on the conveyor belt measured by the camera is $P^C$, then the position of this point in the robot coordinate system is:

$$P^R = H_C^R P^C$$
Since the conveyor belt coordinate system moves dynamically along the direction of conveyor belt movement, the displacement can be calculated from the change in the encoder reading. In this paper, this property of the conveyor belt is represented by the encoder scale factor $Factor_c$ (encoder factor): the ratio of the change in the robot coordinate reading to the corresponding change in the encoder reading [31].
  • Calibration method of the conveyor belt
The calibration of the conveyor belt is completed on the robot Kemotion control system. The controller has a conveyor belt tracking function module; after the relevant configuration work, the visual coordinate system is shifted to the conveyor belt and a conveyor belt coordinate system, Trackingbase, is established. Throughout the calibration, the teaching device displays the robot end coordinate position and the value fed back by the encoder in real time. The calibration steps are as follows:
1. Place the object (workpiece 1) within the visual range, then click the “workpiece grab” button. At this time, check the encoder reading $V_{e0}$ using the teaching device, as shown in Figure 4a.
2. Start the conveyor belt, move the workpiece into the working area of the robot, pause the conveyor belt, manually jog the robot to the workpiece grasping position (point P1, as shown in Figure 4b) using the teaching device, and record the robot position $P_1^R(x_1, y_1, z_1)$ and encoder reading $V_{e1}$.
3. Restart the conveyor belt, let the workpiece run for a further distance (keeping it within the grasp range), pause the conveyor belt again, and move the robot to the position above the workpiece grasping point (point P2, as shown in Figure 4c). Record the robot position $P_2^R(x_2, y_2, z_2)$ and encoder reading $V_{e2}$.
4. Select another workpiece (workpiece 2) and place it at the diagonal point of the first workpiece within the visual range (to improve accuracy, the deviation from workpiece 1 in the Y direction should be as large as possible). Then click “workpiece grab” and record the encoder reading $V_{e3}$, as shown in Figure 4d.
5. Start the conveyor belt, move workpiece 2 to the middle of the robot’s workspace, pause the conveyor belt, jog the robot to point P3 on workpiece 2, and record the position $P_3^R(x_3, y_3, z_3)$ in the robot coordinate system and the encoder reading $V'_{e3}$, as shown in Figure 4e.
In summary, the coordinates of $P_1^R$, $P_2^R$, $P_3^R$, and $O_C^R$ within the scope of robot grasping are marked, and their relationship is shown in Figure 5, where $O_C^R$ represents the dynamic origin position of the conveyor belt.
According to the above data, the scale factor can be obtained by

$$\Delta L_R = \sqrt{(x_2 - x_1)^2 + (y_2 - y_1)^2 + (z_2 - z_1)^2}$$

$$\Delta L_C = V_{e2} - V_{e1}$$

$$Factor_c = \frac{\Delta L_R}{\Delta L_C}$$

It can be seen that if the encoder readings $V_{e1}$ and $V_{e2}$ at the start and end of a stretch of conveyor belt movement are known, the distance moved by the target object in the direction of conveyor movement can be obtained from the scale factor:

$$\Delta L = (V_{e2} - V_{e1}) \cdot Factor_c$$
The origin of the conveyor belt coordinate system enters the range of the robot after a distance of $\Delta L_1$. At this time, the coordinate of the origin of the conveyor belt coordinate system relative to the robot coordinate system is denoted $O_C^R$. The moving distance is obtained from $V_{e3}$, the encoder reading at the calibrated starting position of workpiece 2, and $V'_{e3}$, the encoder reading when workpiece 2 reaches position P3:

$$\Delta L_1 = (V'_{e3} - V_{e3}) \cdot Factor_c$$
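The following is a small numeric sketch of this scale-factor computation: the robot poses and encoder readings below are made-up stand-ins for the values recorded during teaching, not the experimental data.

```python
# Numeric sketch of the scale-factor computation: DeltaL_R is the Euclidean
# distance between the two taught robot poses, DeltaL_C the matching encoder
# increment, and Factor_c converts encoder counts into millimetres of belt travel.
import math

P1 = (-161.6, 199.7, -890.3)   # robot pose at the first grasp point (mm), hypothetical
P2 = (-15.0, 201.7, -890.5)    # robot pose after the belt moved (mm), hypothetical
Ve1, Ve2 = 10000.0, 12932.0    # encoder readings at the two poses (counts), hypothetical

delta_L_R = math.dist(P1, P2)       # distance travelled in the robot frame (mm)
delta_L_C = Ve2 - Ve1               # encoder increment (counts)
factor_c = delta_L_R / delta_L_C    # mm per encoder count

def belt_travel(ve_start: float, ve_end: float) -> float:
    """Belt displacement implied by two encoder readings, in mm."""
    return (ve_end - ve_start) * factor_c

print(f"Factor_c = {factor_c:.5f} mm/count; check: {belt_travel(Ve1, Ve2):.1f} mm")
```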
According to the coordinate vector diagram of the conveyor belt (Figure 5), the following relationships can be established, where $O_C^R$ is the coordinate of the origin of the moved conveyor belt coordinate system relative to the robot coordinate system and $P_i^R$ (i = 1, 2, 3, 4) represents the position of the workpiece in the robot coordinate system:

$$\begin{cases} (P_3^R - O_C^R) \cdot (P_2^R - P_1^R) = 0 \\ \left\| (P_1^R - O_C^R) \times (P_2^R - P_1^R) \right\| = 0 \\ \left( (P_3^R - O_C^R) \times (P_2^R - P_1^R) \right) \cdot (P_2^R - P_1^R) = 0 \end{cases}$$
The coordinates of the origin of the moved conveyor belt coordinate system relative to the robot coordinate system can be obtained by solving the above system:

$$O_C^R = (x_{C0}^R, y_{C0}^R, z_{C0}^R)^T$$

where each component is a closed-form rational function of the coordinates of $P_1^R$, $P_2^R$, and $P_3^R$ (the full expressions are lengthy and are not reproduced here).
Substituting the coordinates of $P_1^R$, $P_2^R$, and $P_3^R$ gives:

$$O_C^R = (788.63, 232.751, 862.108)^T$$
The basis vectors of the conveyor belt coordinate system are obtained by

$$X_C = \frac{P_2^R - O_C^R}{\left\| P_2^R - O_C^R \right\|}, \quad Y_C = \frac{P_3^R - O_C^R}{\left\| P_3^R - O_C^R \right\|}, \quad Z_C = X_C \times Y_C$$
Substituting the $O_C^R$ coordinate value into Equation (9) yields the following results:

$$X_C = (0.1357, 0.9908, 0.0001)^T, \quad Y_C = (0.1345, 0.0183, 0.9908)^T, \quad Z_C = (0.9816, 0.1345, 0.1357)^T$$
Figure 6 shows the original conveyor belt coordinate system C′, the coordinate system C obtained after the belt runs the distance $\Delta L_1$, and the robot coordinate system R. $H_{C'}^C$ represents the transformation matrix between the original conveyor belt coordinate system C′ and the coordinate system C, and $H_C^R$ represents the transformation matrix between the coordinate system C and the robot coordinate system.
The transformation matrix $H_C^R$ between the dynamic conveyor coordinate system and the robot coordinate system is obtained via Equations (8) and (9):

$$H_C^R = \begin{bmatrix} X_C & Y_C & Z_C & O_C^R \\ 0 & 0 & 0 & 1 \end{bmatrix}$$

$$H_{C'}^C = Trans(X_C, \Delta L_1) = \begin{bmatrix} 1 & 0 & 0 & \Delta L_1 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}$$

The relationship matrix between the original coordinates of the conveyor belt and the robot coordinate system can then be obtained by

$$H_{C'}^R = H_C^R H_{C'}^C = \begin{bmatrix} X_C & Y_C & Z_C & O_C^R \\ 0 & 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} 1 & 0 & 0 & \Delta L_1 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}$$
The position of the target point in the robot coordinate system is $P^R$, and the initial position of the camera-measured point in the initial coordinate system of the conveyor belt is $P^{C'}$; the relationship between the two can therefore be expressed by Equation (13):

$$P^R = H_{C'}^R \, Trans(X_C, \Delta L) \, P^{C'}$$

where the translation matrix is

$$Trans(X_C, \Delta L) = \begin{bmatrix} 1 & 0 & 0 & \Delta L \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}$$
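As an illustration of how these matrices compose, the sketch below assembles the homogeneous transforms in numpy and pushes a belt-frame point into the robot frame. The axis vectors and origin are the values printed above, reproduced as-is (their signs should be taken as indicative), and both belt displacements are hypothetical.

```python
# Sketch of the transformation chain P^R = H_{C'}^R Trans(X_C, dL) P^{C'} in numpy.
import numpy as np

X_C = np.array([0.1357, 0.9908, 0.0001])
Y_C = np.array([0.1345, 0.0183, 0.9908])
Z_C = np.array([0.9816, 0.1345, 0.1357])
O_C = np.array([788.63, 232.751, 862.108])

def frame(x, y, z, o):
    """Homogeneous matrix with axis columns x, y, z and origin o."""
    H = np.eye(4)
    H[:3, 0], H[:3, 1], H[:3, 2], H[:3, 3] = x, y, z, o
    return H

def trans_x(dL):
    """Pure translation of dL along the X_C axis."""
    T = np.eye(4)
    T[0, 3] = dL
    return T

H_C_R = frame(X_C, Y_C, Z_C, O_C)     # H_C^R from the axis vectors and origin
H_Cp_R = H_C_R @ trans_x(150.0)       # H_{C'}^R = H_C^R Trans(X_C, dL1), dL1 = 150 mm assumed

p_belt = np.array([10.0, 25.0, 0.0, 1.0])    # point in the initial belt frame
p_robot = H_Cp_R @ trans_x(300.0) @ p_belt   # dL = 300 mm of belt travel assumed
print(p_robot[:3])
```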
Through this calibration method, the transformation matrix between the conveyor coordinate system and the robot coordinate system is obtained, and the influence of assembly error is reduced.

3.2. Calibration between the Camera System and the Conveyor Belt System

The purpose of belt calibration is to handle the fact that the camera operates outside the robot’s operating range. Through the conveyor belt calibration experiment, the relationship between the conveyor coordinate system and the camera coordinate system can be obtained. In combination with $H_{C'}^R$ obtained from Equation (12), the transformation matrix between the camera coordinate system and the robot coordinate system is further obtained.
The camera of the system is outside the robot operation space, so the encoder value must be introduced into the calibration of the camera external parameters. Firstly, four target points are placed in the visual operation range. After the camera locates them, the positions of the calibration points relative to the camera coordinate system, $P_1^V$, $P_2^V$, $P_3^V$, $P_4^V$, are calculated. The conveyor belt is then started and moved a distance $\Delta L$ so that the points enter the operation range as $P_1^R$, $P_2^R$, $P_3^R$, and $P_4^R$. Using the scale factor $Factor_c$ and the transformation $H_{C'}^R$ between the two coordinate systems, the relationship involving the camera coordinate system and the conveyor belt coordinate system is obtained as follows:

$$P_i^R = H_{C'}^R \, Trans(X_C, \Delta L) \, (H_V^{C'} P_i^V), \quad i = 1, 2, 3, 4$$
From Equation (14), the transformation matrix between the camera coordinate system and the conveyor coordinate system is obtained as follows:

$$H_V^{C'} = \left( Trans(X_C, \Delta L) \right)^{-1} \left( H_{C'}^R \right)^{-1} P_i^R \left( P_i^V \right)^{-1}, \quad i = 1, 2, 3, 4$$
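One way to sanity-check this inversion numerically is to synthesize the four robot-side points from a known camera-to-belt pose and confirm that the recovery reproduces it, as in the sketch below. All poses and point coordinates are placeholders, not values from the experiment.

```python
# Numerical sanity check of the camera-to-belt solution.
import numpy as np

def trans_x(dL):
    T = np.eye(4)
    T[0, 3] = dL
    return T

# Belt-to-robot pose assumed known from the Section 3.1 calibration (placeholder).
H_Cp_R = np.array([[0., -1., 0.,  500.],
                   [1.,  0., 0.,  200.],
                   [0.,  0., 1., -890.],
                   [0.,  0., 0.,    1.]])
dL = 300.0   # belt travel while the points move into the robot's reach (mm)

# Ground-truth camera-to-belt pose, used here only to synthesize test data.
H_V_Cp_true = np.array([[1., 0., 0.,  50.],
                        [0., 1., 0., -20.],
                        [0., 0., 1., 400.],
                        [0., 0., 0.,   1.]])

# Four calibration points in the camera frame, stacked as homogeneous columns.
P_V = np.array([[ 10.,  80.,  10.,  80.],
                [ 10.,  10.,  60.,  60.],
                [500., 510., 505., 498.],
                [  1.,   1.,   1.,   1.]])

P_R = H_Cp_R @ trans_x(dL) @ H_V_Cp_true @ P_V   # what the robot would measure

# H_V^{C'} = Trans(X_C, dL)^{-1} (H_{C'}^R)^{-1} P^R (P^V)^{-1}
H_V_Cp = (np.linalg.inv(trans_x(dL)) @ np.linalg.inv(H_Cp_R)
          @ P_R @ np.linalg.inv(P_V))
assert np.allclose(H_V_Cp, H_V_Cp_true)
```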

3.3. System Comprehensive Calibration Experiment

The traditional conveyor belt calibration method does not consider the angle deflection error between the conveyor belt coordinate system and the robot coordinate system. In this paper, by improving the traditional conveyor belt calibration method, a more accurate transformation matrix between the two coordinate systems is obtained, and the influence of the two calibration methods on the system error is compared through experiments.
As shown in Figure 7, the camera is calibrated to obtain its internal parameters and the position $P_i^V$ of each target point in the camera coordinate system.
The end probe of the robot is moved to the initial point O to obtain the coordinate position O0 of this point in the robot coordinate system. The conveyor belt is controlled to move an equal distance four times, each movement being 150 mm, with the corresponding encoder variation given by

$$\Delta L_C = \frac{\Delta L_R}{Factor_c}$$

The positions measured in the robot coordinate system at the start and after each movement are shown in Figure 8: O1 (−161.588, 199.710, −890.267), O2 (−14.956, 201.710, −890.468), O3 (130.944, 204.840, −890.266), O4 (277.788, 207.460, −889.244), and O5 (433.932, 210.140, −891.460).
Using the above two calibration methods, two groups of post-movement positions in the robot coordinate system are obtained, as shown in Table 1.
According to the data in Table 1, the maximum cumulative deviation of the calibration method in this paper is 3.839 mm in the specified operating space, far lower than the 13.841 mm obtained with the traditional calibration method. The deviation caused by the calibration method in this paper is clearly smaller than that of the traditional method: the system calibration accuracy is effectively improved, and the calibration method is simple and practical.
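For reference, the deviations in Table 1 are plain Euclidean distances between the actual and calibrated coordinates; the short script below recomputes them from the tabulated values (rows 2–5) so they can be compared directly with the error columns.

```python
# Recompute the Table 1 deviations as Euclidean distances.
import math

rows = [  # (actual, this work, traditional), all in mm, from Table 1
    ((-14.956, 201.710, -890.468), (-13.738, 202.960, -890.267), (-11.588, 199.710, -890.267)),
    ((130.944, 204.840, -890.266), (133.962, 205.640, -890.267), (138.412, 199.710, -890.267)),
    ((277.788, 207.460, -889.244), (280.812, 208.434, -890.267), (287.230, 199.710, -890.267)),
    ((433.932, 210.140, -891.460), (434.392, 213.760, -890.267), (441.952, 199.710, -890.267)),
]

for i, (actual, ours, trad) in enumerate(rows, start=2):
    print(f"row {i}: comprehensive {math.dist(actual, ours):6.3f} mm, "
          f"traditional {math.dist(actual, trad):6.3f} mm")
```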

4. Screening of Coal Gangue Image

4.1. Principle of Image Screening and Recognition

The camera field of view as coal and gangue pass through the image acquisition area on the conveyor belt is shown in Figure 9. The gangue image in the green ellipse is incomplete, indicating distorted picture information. Therefore, it is necessary to design a calculation method to screen the coal gangue images so that the recognition system automatically skips the distortion area and only collects the complete image information in the red box.
In this experiment, coal and gangue within the size range of 60–200 mm are identified. In order to ensure that fractions larger or smaller than the capacity of the robot do not reach the conveyor, the coal gangue sorting robot is equipped with a coal gangue queuing arrangement device (label 2 in Figure 10) at the input end of the coal gangue conveying belt. As shown in Figure 10, the coal is first conveyed to the vibration classification screen (label 1) through the coal conveyor for screening and grading according to particle size. The coal gangue mixture is then transported to the conveyor belt, and the alignment of the coal gangue pieces and the distance between them are controlled by the queuing mechanism (label 2); in addition, the coal gangue pieces can be separated simultaneously. The materials on the coal gangue conveyor belt are thus scattered, the shape of the coal gangue is irregular, and the pieces do not stack on top of one another.
As shown in Figure 11, the integrity of the selected image information can be ensured by restricting the selected regional centroid P to $[r, r + N]$.
In Figure 11, r is the maximum radius of the coal or gangue sample in mm and L is the field distance in the direction of conveyor belt movement in mm.
Selecting the centroid position before calculating the regional eigenvalue effectively avoids distortion of the coal gangue image information and improves the recognition efficiency of the system.
In practical engineering applications, when coal and gangue pass through the visual acquisition system, the frames per second (FPS) of the vision system depend on the field of view (FOV) and the velocity $V_C$ of the conveyor belt. In most cases, the photographing frequency should not be too high, to limit the data processing burden on the image recognition system. It is therefore necessary to select an acquisition frequency such that every object on the conveyor belt is photographed once or twice while passing through the visual acquisition area, maintaining the stability of the vision system. When each object is photographed twice, the frame rate of the vision system is calculated by Equation (17):

$$FPS = \frac{V_C}{FOV - 2r} \times 2 = 3.5 \ \text{frames/s}$$

where the maximum running speed of the conveyor belt is $V_C = 0.5$ m/s and the visual field size of the vision system in the moving direction of the conveyor belt is FOV = 0.7 m. Therefore, when the frame rate of the vision system is set to seven frames per second, the problem of missing images can be effectively avoided.
Unlike a sensor-triggered system, the coal gangue recognition system takes pictures at fixed time intervals. Although this improves the recognition efficiency of the system, it introduces interference from repeated objects and increases the computational load on the controller. As shown in Figure 8, four images are collected using the time-based image acquisition method, and the same gangue, P1, appears several times across the four pictures. The image system processes the images collected each time; if the same gangue is identified in several acquisitions, the system repeatedly sends the position information of that gangue to the robot. Therefore, a mechanism is needed to distinguish the same object across the four images: if the collected position data are judged to belong to an existing object, the data are discarded.
Assuming that the coordinate information of the newly collected data is $P = (x_0, y_0, z_0, E_0)$, where $E_0$ is the value of the conveyor belt encoder at acquisition time, Equation (18) can be used to determine whether the data are discarded:

$$\left| \sqrt{(x_i - x_0)^2 + (y_i - y_0)^2 + (z_i - z_0)^2} - \left| (E_i - E_0) \cdot Factor_c \right| \right| < \Delta$$

where $E_i$ represents the encoder value recorded previously, $(x_i, y_i, z_i)$ is the coordinate position of each gangue identified in the previous picture, and $\Delta$ is the allowable error range set in advance.
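A direct implementation of this test is straightforward; the sketch below applies Equation (18) against each stored detection and discards the new record when any match is found. The values of $Factor_c$ and $\Delta$ are placeholders standing in for the calibrated scale factor and the application tolerance.

```python
# Sketch of the duplicate-elimination test of Equation (18): a new detection is
# discarded when its distance to a stored detection matches the belt travel
# implied by the encoder difference, within the preset tolerance.
import math

FACTOR_C = 0.05   # mm per encoder count, from the belt calibration (placeholder)
DELTA = 15.0      # allowable mismatch in mm (placeholder)

def is_duplicate(new, stored):
    """new/stored: (x, y, z, encoder) records of a detected gangue."""
    spatial = math.dist(new[:3], stored[:3])       # distance between detections
    belt = abs((stored[3] - new[3]) * FACTOR_C)    # distance the belt has moved
    return abs(spatial - belt) < DELTA

def filter_detection(new, history):
    """Store the detection only if it matches no earlier record."""
    if any(is_duplicate(new, old) for old in history):
        return False        # same gangue seen again: discard
    history.append(new)
    return True

history = []
print(filter_detection((100.0, 20.0, -890.0, 15000.0), history))  # True: new object
print(filter_detection((250.0, 21.0, -890.0, 18000.0), history))  # False: same object after 150 mm of belt travel
```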
In conclusion, by selecting an appropriate acquisition frame rate and using the duplicate-elimination calculation method, the problems of gangue images being missed and repeatedly captured are effectively solved, thereby improving the processing efficiency of the sorting system.

4.2. Experimental Verification

In this paper, the online identification and sorting experiment of coal gangue was carried out in the laboratory. The camera was a DALSA Genie Nano M2590 NIR; its related parameters are shown in Table 2.
In this paper, the Computar M1614-MP lens was selected; its specific parameters are shown in Table 3.
Figure 12 shows the online recognition process of the coal gangue identification software. Coal and gangue pass through the visual acquisition area in turn; gangue is identified by the gangue image recognition software, and its coordinate information is transmitted to the target information database via the TCP protocol. Figure 13 shows the coal gangue grabbing experiment, in which the parallel robot controls the pneumatic gripper to track and grasp the gangue. The experimental results show that the Delta coal gangue sorting parallel robot has good coal gangue recognition performance, meeting the sorting speed and accuracy requirements of engineering sites.

5. Conclusions

(1) A Delta parallel coal gangue online sorting robot based on image recognition is proposed, its working principle is explained, and the design process of the image recognition and positioning function is introduced. The robot can be used for online automatic recognition and sorting of coal gangue, helping achieve the goal of “safety with few people and safety with none”.
(2) An improved comprehensive calibration method is proposed. By solving the transformation matrix between the camera coordinate system and the conveyor belt coordinate system and between the conveyor belt coordinate system and the robot coordinate system, the influence of robot installation error on grasping accuracy is effectively avoided. The experimental results show that the integrated calibration method is simple, significantly improves robot grasping accuracy, and meets the actual needs of coal gangue sorting.
(3) Regarding repeated image acquisition in the coal gangue recognition system, the problems of missed shots and repeated image acquisition are effectively solved by selecting an appropriate acquisition frame rate and using the duplicate-elimination calculation method. The experimental results show that this approach effectively improves sorting system efficiency.
(4) The application of the parallel robot in the field of coal gangue sorting is a preliminary exploration, and this experiment proves that it is feasible and effective. With the continuous advancement of intelligent construction in coal mines, many technical bottlenecks remain in the common key technologies of coal mine robots, especially in special coal mine robots that can operate independently. Challenges of this study and possibilities for improvement exist in different technological aspects, such as fuzzy logic mechanisms or the use of 3D cameras. Therefore, strengthening basic theoretical research and innovation in intelligent equipment is an inevitable requirement for the coal industry to realize safe, efficient, intelligent, and green production.

Author Contributions

D.S. was responsible for the data analysis and manuscript writing. Y.W. was responsible for the data mining from the literature. Z.Y. proposed the idea and was responsible for the literature search. J.W. and Y.L. were responsible for the article editing. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Fundamental Research Funds for the Central Universities (No: 2019YJ02).

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Wang, G.F.; Liu, F.; Meng, X.G. Research and practice on intelligent coal mine construction (primary stage). Coal Sci. Technol. 2019, 47, 1–36.
2. Wang, G.F.; Liu, F.; Pang, Y.H. Coal mine intellectualization: The core technology of high quality development. J. China Coal Soc. 2019, 44, 349–357.
3. Liu, F.; Cao, W.J.; Zhang, J.M. Continuously promoting the coal mine intellectualization and the high-quality development of China’s coal industry. China Coal 2019, 45, 32–37.
4. Wang, G.F.; Du, Y.B. Development direction of intelligent coal mine and intelligent mining technology. Coal Sci. Technol. 2019, 47, 1–10.
5. Wang, G.F.; Pang, Y.H.; Ren, H.W. Intelligent coal mining pattern and technological path. J. Min. Strat. Control Eng. 2020, 2, 5–19.
6. Ge, S.R. Present situation and development direction of coal mine robots. China Coal 2019, 45, 18–27.
7. Ge, S.R.; Hu, E.Y.; Pri, W.L. Classification system and key technology of coal mine robot. J. China Coal Soc. 2020, 45, 455–463.
8. Ma, X.M. Study of on-line recognition and automatic separation of waste rock in coal mine. J. Xi’an Univ. Sci. Technol. 2003, 23, 66–68.
9. Cao, X.G.; Fei, J.H.; Wang, P. Study on coal-gangue sorting method based on multi-manipulator collaboration. Coal Sci. Technol. 2019, 47, 7–12.
10. Yuan, H.X. Research on Coal and Gangue Smart Sorting Control System Based on X-ray Image. Master’s Thesis, Northeastern University, Shenyang, China, 1 June 2014.
11. Liu, F.Q.; Qiang, J.S.; Wang, X.H.; Song, J.L. Automatic separation of waste rock in coal mine based on image procession and recognition. J. China Coal Soc. 2000, 25, 534–537.
12. Tan, C.C. The Research of Coal Gangue Identification and Separation Technique Based on Image Processing Technology. Master’s Thesis, Taiyuan University of Technology, Taiyuan, China, 1 June 2017.
13. Tripathy, D.P.; Reddy, K.G.R. Multispectral and joint colour-texture feature extraction for ore-gangue separation. Pattern Recognit. Image Anal. 2017, 27, 338–348.
14. Li, M.; Duan, Y.; Cao, X.G. Image identification method and system for coal and gangue sorting robot. J. China Coal Soc. 2020, 1–9.
15. Wang, J.C.; Li, L.H.; Yang, S.L. Experimental study on gray and texture features extraction of coal and gangue image under different illuminance. J. China Coal Soc. 2018, 43, 3051–3061.
16. Shang, D.Y.; Li, Y.; Liu, Y. Research on the motion error analysis and compensation strategy of the delta robot. Mathematics 2019, 7, 411.
17. Peng, B.B.; Gao, F. Modeling for calibration of parallel robot. Chin. J. Mech. Eng. 2005, 41, 132–135.
18. Tang, G.B.; Huang, T. Kinematic calibration of delta robot. Chin. J. Mech. Eng. 2003, 8, 55–60.
19. Vischer, P.; Clavel, R. Kinematic calibration of the parallel delta robot. Robotica 1998, 16, 207–218.
20. Maurine, P.; Dombre, E. Calibration procedure for the parallel robot Delta 4. In Proceedings of the IEEE International Conference on Robotics and Automation, Minneapolis, MN, USA, 22–28 April 1996; pp. 975–980.
21. Idá, E.; Merlet, J.P.; Carricato, M. Automatic self-calibration of suspended under-actuated cable-driven parallel robot using incremental measurements. Mech. Mach. Sci. 2019, 74, 333–344.
22. Li, Z.X.; Huang, T.; Zhao, X.M. Kinematical calibration method for high-speed parallel manipulator. J. Mach. Des. 2005, 3, 18–20.
23. Zhang, W.C.; Mei, J.P.; Liu, Y. Calibration of delta parallel robot kinematic errors based on laser tracker. J. Tianjin Univ. 2013, 46, 257–262.
24. Fang, L.J.; Dang, P.F. Kinematic calibration method of robots based on quantum-behaved particle swarm optimization. Chin. J. Mech. Eng. 2016, 52, 23–30.
25. Li, J.H. Research on kinematic calibration method of parallel robot. Mech. Sci. Technol. Aerosp. Eng. 2019, 38, 472–479.
26. Zhang, Z. A flexible new technique for camera calibration. IEEE Trans. Pattern Anal. 2000, 22, 1330–1334.
27. Tsai, R.Y. An efficient and accurate camera calibration technique for 3D machine vision. In Proceedings of the IEEE Conference on Computer Vision & Pattern Recognition, Miami Beach, FL, USA, 22–26 June 1986; pp. 364–374.
28. Kang, C.F.; Zheng, Y.K.; Gao, Y.Y. Study of fast calibration method on parallel robot visual fetching system. Mach. Tool Hydraul. 2018, 46, 16–20.
29. Gao, P.; Ke, L.H. Method of calibration based on Delta robot conveyor and vision. Microcontrollers Embed. Syst. 2018, 18, 40–43.
30. Dong, X.M.; Li, Z.B. Application of the computer vision in kinematic calibration of parallel robot. Process. Autom. Instrum. 2016, 37, 16–19.
31. Yang, Q.; Liu, G.F. Method of integrated calibration on DELTA robot conveyor and vision. Mech. Electr. Eng. Technol. 2015, 1, 5–10.
Figure 1. Robotic coal sorting system.
Figure 2. Flowchart of software design for the coal gangue identification and positioning system.
Figure 3. System coordinate system.
Figure 4. Calibration steps.
Figure 5. Conveyor belt coordinates.
Figure 6. Transformation of the coordinate system.
Figure 7. Target point.
Figure 8. Conveyor belt calibration experiment.
Figure 9. Camera field of view.
Figure 10. Coal gangue separation device.
Figure 11. Distance-based screening method.
Figure 12. Online recognition experiment.
Figure 13. Coal gangue sorting experiment.
Table 1. Data and error of two calibration methods (coordinates and errors in mm).

| Serial Number | Actual Coordinates (x, y, z) | Calibration Method in This Work (x, y, z) | Traditional Calibration Method (x, y, z) | Comprehensive Calibration Error | Traditional Error |
|---|---|---|---|---|---|
| 1 | (−161.588, 199.710, −890.267) | (−161.588, 199.710, −890.267) | (−161.588, 199.710, −890.267) | 0 | 0 |
| 2 | (−14.956, 201.710, −890.468) | (−13.738, 202.960, −890.267) | (−11.588, 199.710, −890.267) | 1.757 | 3.922 |
| 3 | (130.944, 204.840, −890.266) | (133.962, 205.640, −890.267) | (138.412, 199.710, −890.267) | 3.122 | 9.060 |
| 4 | (277.788, 207.460, −889.244) | (280.812, 208.434, −890.267) | (287.230, 199.710, −890.267) | 3.338 | 12.258 |
| 5 | (433.932, 210.140, −891.460) | (434.392, 213.760, −890.267) | (441.952, 199.710, −890.267) | 3.839 | 13.841 |
Table 2. Parameters of the camera.

| Project | Parameters |
|---|---|
| Camera model | Genie Nano M2590 NIR |
| Color | Gray (monochrome) |
| Chip size | 2/3″ |
| Resolution (pixels) | 2592 × 2048 |
| Pixel size (µm) | 4.8 |
| Sensor | Charge-coupled device |
| Frame rate (fps) | 22.7 |
| Interface mode | C/CS |
Table 3. Parameters of the camera lens.

| Project | Parameters |
|---|---|
| Lens model | Computar M1614-MP |
| Focal length | 8 mm |
| Target surface size | 2/3″ |
| Maximum imaging size | 8.8 mm × 6.6 mm |
| Aperture | F1.4–F16C |
| Working distance | 0.2– |
| Interface mode | C interface |
