Article

An Inspection Robot for Belt Conveyor Maintenance in Underground Mine—Infrared Thermography for Overheated Idlers Detection

1 Faculty of Mechanical Engineering, Wroclaw University of Science and Technology, Wybrzeże Wyspiańskiego 27, 50-370 Wroclaw, Poland
2 Faculty of Geoengineering, Mining and Geology, Wroclaw University of Science and Technology, Na Grobli 15, 50-421 Wroclaw, Poland
* Author to whom correspondence should be addressed.
Appl. Sci. 2020, 10(14), 4984; https://doi.org/10.3390/app10144984
Submission received: 26 May 2020 / Revised: 14 July 2020 / Accepted: 15 July 2020 / Published: 20 July 2020
(This article belongs to the Section Mechanical Engineering)

Abstract

It is well known that mechanical systems require supervision and maintenance procedures. There are a lot of condition monitoring techniques that are commonly used, and in the era of IoT and predictive maintenance one may find plenty of solutions for various applications. Unfortunately, in the case of belt conveyors used in underground mining, the list of possible solutions shrinks quickly. The reason is that they are specific mechanical systems: a typical conveyor is located in a mining tunnel and its length may vary between 100 and 1000 m. According to mining regulations, a visual inspection of the conveyor route should be performed before the conveyor starts operation. On the other hand, since environmental conditions in mining tunnels are extremely harsh and the risk of accidents is high, there is a tendency to minimize human presence in the tunnels. In this paper, we propose a prototype of an inspection robot based on a UGV platform that could support maintenance staff during the inspection. At present, the robot is controlled by an operator over a radio link; however, we plan to make it autonomous. Its support could be significant: the robot can "see" elements of the conveyor route (RGB camera) and can identify hot spots using infrared thermography. Detected hot spots can be localized, and their positions can be stored together with both types of images. In parallel, it is possible to preview the images in real time, and the stored data allow the state of the conveyor system to be analyzed after the inspection mission. It is also important that, thanks to the radio control system, the operator can stay in a safe place. Such a robot can be classified as a mobile monitoring system for spatially distributed underground infrastructure.

1. Introduction

The demand for raw materials has increased significantly in recent years. Easily accessible deposits have been exhausted, so we need to reach deeper and deeper to get the treasures of the Earth. Mining is usually considered to be risky and dangerous, and miners arouse respect and fear. However, effective mining today is a matter of advanced technologies rather than magic. Increased needs, harsh conditions in deep mines, and global competition related to production cost: all these factors lead to the conclusion that advanced technologies must be used in modern mining. MINE Magazine recently identified ten emerging technologies that may change the mining industry. Among others, robotics, predictive maintenance, remote operating and monitoring centers, and advanced analytics have been selected as the future of mining [1]. Deep underground mining is a general term; however, every single mine is very different and depends on the geology, raw material type, geometry of the deposit, technology used, size of the mine, machines used, natural hazard conditions (poisonous gases, rockburst tendencies), etc. Therefore, each advanced solution should be tuned to the considered use case. In the deep underground copper ore mine considered here, the majority of the mentioned technologies have been, or are currently being, implemented. One of the serious challenges is to maintain a spatially distributed, belt conveyor-based ore transport network over a very large area.
A belt conveyor system is used for bulk material transportation over large distances. The system considered here consists of dozens of belt conveyors, each driven by 1–4 drive units, see Figure 1. The transportation system has a tree-like structure, and the main transportation line consists of conveyors connected in series, see Figure 2. A failure of any of them might stop material transport in the mine and cause massive production losses. Reliable operation of the belt conveyor system is critically important to keep the production volume at the assumed level. Therefore, each drive unit (see Figure 1) used in a conveyor is equipped with electric current and temperature monitoring systems to supervise its operation. The acquired SCADA data are monitored and processed in the Data Monitoring and Analysis Center (called the One Control Room) that has been created in the mine, see Figure 3. In case of any suspicious situation, the maintenance team is asked to perform a detailed inspection and advanced diagnostics, including vibration measurement, see Figure 4.
While monitoring of the drive units is almost solved thanks to SCADA and occasional detailed inspections, supervision of the elements of the conveyor route unfortunately remains a challenge. It should be done regularly, as various components may be damaged or may simply require some corrections. Thanks to the Internet of Things (IoT) concept, some authors have proposed solutions based on distributed smart sensor networks, but taking into account the number of idlers to be monitored, the slow development of wireless transmission systems in underground mines, the reliability of such solutions, etc., there is a need to search for more realistic and practical scenarios [2].
One of the most critical issues is friction between stationary elements and the moving belt, as well as the increased rolling resistance of the idlers supporting the belt. Research on idlers is an important topic in belt conveyor exploitation and maintenance [3,4]. Damaged idlers may be a source of increased energy consumption or may even initiate belt damage or fire. Therefore, to avoid catastrophic failures, a visual inspection is performed by miners along the conveyor route (Figure 5). Note (see Figure 2) that there are tens of kilometers of conveyors and that they should be checked from both sides. This requires human resources, takes time, and is quite risky for people. The idea of using mobile robots for inspection is not a new one. There are some promising papers on this subject [5,6,7,8,9]. Several inspection scenarios have been defined under the umbrella of the THING project [10]. Interesting examples of using inspection robots in underground mines are mapping or exploration [11,12,13,14,15,16], support of rescue actions [17,18,19], or other "mine disasters" [20].
In this paper, we propose a UGV (Unmanned Ground Vehicle) platform as a virtual miner that will do the same work as the inspection staff, namely ride along the belt conveyor and collect information about conveyor operation using sensors. In Section 2, we provide a brief description of the robotic platform. Section 3 contains the use case description: an experiment in our laboratory, where the belt conveyor test rig is located. We also provide the plan and results of our experiments. Section 4 contains the methods and results of the inspection data analysis. Finally, the Discussion and Conclusions sections are provided.

2. A UGV Platform

The developed mobile robot belongs to the UGV (Unmanned Ground Vehicle) class of systems. The frame has been assembled using lightweight aluminum construction rails. The drive block has been installed in the front part, and the back contains the cargo space, connected to the front with a two-axis joint (Figure 6). This allows all of the wheels to maintain contact with the ground at all times despite the stiff suspension. Additionally, such a solution provides good stability (Figure 7).
The wheels are driven by 24 V, 250 W DC motors integrated with a cylindrical gearbox with a 19:1 ratio. Additional reduction and power transmission are achieved by using a chain and cogwheels as an intermediate step. The driving system is a classic (2,0)-class structure. Maneuvering is performed by applying a difference in the speeds of the wheels. The robot is powered by two 12 V batteries with a capacity of 12 Ah each. The voltage for powering the electronics (5 V) is obtained using a step-down converter.

2.1. Control System of the Mobile Platform

The robot control system is presented in Figure 8. Driving the robot is performed by controlling the speed of the wheels using PWM (Pulse Width Modulation) signals, as well as a direction signal, generated by the ATmega328P microcontroller of a standard Arduino Nano module. The PWM signals reach the motors via H-bridges powered through an additional relay enabling power cutoff (remotely or in case of an emergency stop). A motor current measurement system has been implemented in the driver to monitor potential overload [21]. The implementation of the control system is shown in Figure 9.
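For illustration only, the listing below is a minimal sketch of the skid-steer speed mixing described above: operator throttle and turn commands are combined into per-wheel PWM duty cycles and H-bridge direction bits. It is written in Python for readability; the function name, the normalized command ranges, and the 8-bit PWM resolution are assumptions made for this example, and the actual onboard firmware (running on the ATmega328P) is not reproduced here.
```python
def mix_skid_steer(throttle: float, turn: float, pwm_max: int = 255):
    """Convert operator commands into per-wheel PWM duty and direction bits.

    throttle and turn are normalized to [-1, 1]; the 8-bit PWM range and the
    function name are assumptions made for this illustration.
    """
    left = max(-1.0, min(1.0, throttle + turn))
    right = max(-1.0, min(1.0, throttle - turn))
    left_dir = 1 if left >= 0 else 0    # direction signal for the left H-bridge
    right_dir = 1 if right >= 0 else 0  # direction signal for the right H-bridge
    return int(abs(left) * pwm_max), left_dir, int(abs(right) * pwm_max), right_dir


# Example: full throttle with a gentle right turn
print(mix_skid_steer(1.0, 0.2))  # (255, 1, 204, 1)
```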
An additional camera observes the area in front of the robot for manual steering. Its video feed is transmitted wirelessly on a selected channel in the 5.8 GHz band to the remote control panel [22]. The operator steers the robot based on the video feed displayed on an external monitor, sending PPM (Pulse Position Modulation) control signals from the remote manipulator over an independent 2.4 GHz band. The structure of the remote control panel is presented in Figure 10.

2.2. Robot Localization System

During the inspection, it is important to precisely localize the place where a fault occurred. One of the systems precise enough for event localization is GPS. It can be successfully used for positioning in open space, with high reliability and an easy-to-use interface. However, GPS signals are not available in underground mines, so for this use case Ultra-Wideband (UWB) transceivers were proposed for the localization system (see Figure 11).
The localization system is based on DWM1001 modules equipped with firmware that provides two-way ranging (TWR) distance estimation and a real-time localization system (RTLS). Inside the module, besides the subsystem directly responsible for distance estimation (the UWB transceiver), there is also a Nordic nRF52832 Bluetooth® microcontroller and a high-performance three-axis linear accelerometer (LIS2DH12) with a digital I2C/SPI serial interface [23]. The module is configured via an API that can be accessed over various interfaces such as UART, SPI, I2C, and BT. Modules configured as anchors (A) play the role of fixed beacons, and the module installed on the robot operates as a tag (T).
The bi-directional communication and data exchange between the modules, together with the known position of each anchor in space, allow the distances between them to be calculated precisely and, using trilateration algorithms, the position of the tag in space to be estimated (see Figure 12).
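As an illustration of the trilateration step, the following Python sketch estimates a 2D tag position from measured ranges to anchors with known coordinates, using a linearized least-squares solution. The anchor coordinates and the function name are assumptions made for the example; this does not reproduce the RTLS firmware shipped with the DWM1001 modules.
```python
import numpy as np

def trilaterate(anchors: np.ndarray, ranges: np.ndarray) -> np.ndarray:
    """Estimate the 2D tag position from distances to at least three anchors.

    anchors: (N, 2) known anchor coordinates in meters
    ranges:  (N,)  measured tag-to-anchor distances in meters
    """
    a0, r0 = anchors[0], ranges[0]
    # Subtracting the range equation of anchor 0 from the others linearizes
    # the problem: 2 (a_i - a_0) . p = |a_i|^2 - |a_0|^2 - r_i^2 + r_0^2
    A = 2.0 * (anchors[1:] - a0)
    b = (np.sum(anchors[1:] ** 2, axis=1) - np.sum(a0 ** 2)
         - ranges[1:] ** 2 + r0 ** 2)
    position, *_ = np.linalg.lstsq(A, b, rcond=None)
    return position

# Example with three anchors placed along a short conveyor (coordinates are illustrative)
anchors = np.array([[0.0, 0.0], [3.5, 1.0], [7.0, 0.0]])
true_tag = np.array([2.0, 0.5])
ranges = np.linalg.norm(anchors - true_tag, axis=1)
print(trilaterate(anchors, ranges))  # approximately [2.0, 0.5]
```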
The inspection itself is carried out by driving the robot along the conveyor keeping close distance to it. Deviation of the measurement system in a vertical direction with respect to the conveyor is expected to be minimal, probably negligible. Hence, the number of localization modules can be very limited.
Modules configured as anchors have been installed along the conveyor with such spacing that the tag is always within range of at least one of them, see Figure 13. For the purpose of the experiment, the beacon has been placed on a stand (Figure 14). During the inspection, the position relative to the unique antenna is recorded in parallel with the sensor data. Hence, it is possible to localize the place of any event. Using the proposed system, it is possible to achieve an accuracy of 10 cm.

2.3. Onboard Sensors

While at the moment the authors focus on hotspot detection based on thermal imaging, it is planned to equip the robot with an extensive sensor suite. Considering that the final result of a diagnostic mission is to provide all the necessary information to the maintenance crew and actually send people to the location, it is important to ensure that the environment is safe for them to arrive. Hence, it is crucial to be able to monitor additional parameters, such as air temperature and humidity, the concentration of dangerous gases such as CO, H₂S or NOₓ, or the velocity of the air current (some corridors are used for ventilation and air currents can be really strong). Besides the safety parameters, inspection of other features is planned (e.g., idler set dislocation or internal faults, such as bearing defects, detectable via vibroacoustic data analysis).
To address those needs, besides the already present thermal and visual cameras, we plan to equip the robot with sensors such as LIDAR, gas concentration sensors, temperature and humidity sensors, air velocity sensor, and a microphone for noise measurements.
The algorithm for reading and recording sensory data is shown in Figure 15. The application works in two threads: one thread is responsible for reading the location data, the other for reading data from the cameras.
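A minimal Python sketch of this two-thread structure is given below: one thread polls the localization tag and the other grabs camera frames, and both push timestamped records into a shared queue. The helper callables standing in for the real UWB module and cameras, as well as the polling periods, are hypothetical placeholders, not the authors' acquisition code.
```python
import queue
import threading
import time

records = queue.Queue()
stop = threading.Event()

def location_worker(read_position):
    # Thread 1: poll the UWB tag and store timestamped positions.
    while not stop.is_set():
        records.put(("position", time.time(), read_position()))
        time.sleep(0.1)          # placeholder polling period

def camera_worker(grab_frame):
    # Thread 2: grab frames (thermal or RGB) and store them with timestamps.
    while not stop.is_set():
        records.put(("frame", time.time(), grab_frame()))
        time.sleep(1 / 20)       # placeholder frame period

# Hypothetical data sources standing in for the real UWB module and cameras.
threads = [
    threading.Thread(target=location_worker, args=(lambda: (0.0, 0.0),)),
    threading.Thread(target=camera_worker, args=(lambda: b"frame-bytes",)),
]
for t in threads:
    t.start()
time.sleep(1.0)
stop.set()
for t in threads:
    t.join()
print(records.qsize(), "records collected")
```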

2.4. Inspection Data Analysis Module

At the current stage of development, the inspection is focused on hotspot detection using thermal imaging. Data analysis is carried out offline (ultimately, it is to be carried out in real time during the inspection). First, the image is acquired; then, thresholding based on color is performed. When a detected area has a sufficiently large surface, a message is generated. In addition, the event view along with the location is saved and forwarded for verification by the service team. The algorithm was implemented in Python using the OpenCV library [24], and the general flowchart is presented in Figure 16.
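The paper states that the algorithm of Figure 16 was implemented in Python with OpenCV [24]; below is a minimal sketch of the described steps (color thresholding, area check, message generation). The HSV range, the minimum area, and the file name are illustrative assumptions, not the authors' settings.
```python
import cv2
import numpy as np

def detect_hot_areas(ir_frame_bgr, min_area_px=500):
    """Threshold an IR camera frame by color and report sufficiently large hot areas.

    The HSV range below assumes a palette where hot regions appear yellow/red;
    both the range and min_area_px are illustrative, not the authors' values.
    """
    hsv = cv2.cvtColor(ir_frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array([0, 120, 180]), np.array([40, 255, 255]))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    hot = [c for c in contours if cv2.contourArea(c) >= min_area_px]
    if hot:
        print(f"Hot spot candidate: {len(hot)} region(s) above {min_area_px} px")
    return hot, mask

# Example usage on a stored frame (file name is hypothetical)
# frame = cv2.imread("ir_frame_0001.png")
# regions, mask = detect_hot_areas(frame)
```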

2.5. Data Transmission and Recording

Data acquisition and transmission take place in parallel over several channels. Thermal and RGB images are registered locally on the onboard computer. Additionally, together with the vision data, the robot position relative to the localization system and time information are recorded, which allows the two independent camera feeds to be synchronized and event locations, such as faults, to be identified. To confirm the correctness of the view, it is possible to preview the camera feed via WiFi (for example, using a remote desktop). The data are downloaded after the mission ends.
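The sketch below shows one possible way, under stated assumptions, to link every stored frame with a timestamp and the current tag position so that the two camera feeds can be aligned after the mission. The CSV layout, column names, and file names are assumptions made for the example, not the authors' recording format.
```python
import csv
import time
from pathlib import Path

def log_frame(writer, camera_id, frame_path, position):
    """Append one record linking a saved frame to its timestamp and UWB position."""
    writer.writerow({
        "t": f"{time.time():.3f}",
        "camera": camera_id,       # e.g. "ir" or "rgb"
        "frame": frame_path,
        "x": f"{position[0]:.2f}",
        "y": f"{position[1]:.2f}",
    })

log_path = Path("mission_log.csv")  # hypothetical file name
with log_path.open("w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["t", "camera", "frame", "x", "y"])
    writer.writeheader()
    log_frame(writer, "ir", "ir_0001.png", (2.0, 0.5))
    log_frame(writer, "rgb", "rgb_0001.png", (2.0, 0.5))
```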
The structure of the system is prepared in such a way that it is possible to conduct a mission autonomously; however, at this early stage of development the robot works in teleoperation mode. Further publications will describe more results related to autonomous operation.

3. The Use Case Description

3.1. A Belt Conveyor Test Rig

The biggest belt conveyor research center in Europe is located in the GEO3EM Research Center at the Wroclaw University of Science and Technology, Faculty of Geoengineering, Mining and Geology. One of the available test rigs is a 7 m belt conveyor test rig with artificially introduced belt damage, idlers in different conditions, and the ability to operate at various belt speeds (Figure 17). This test rig has been developed for research related to belt conveyor diagnostics and maintenance. In this paper, we used this test rig to investigate the application of a mobile inspection robot (UGV).

3.2. Plan of the Experiment

The experiment consisted of two components: traveling along the conveyor belt during its operation (see Figure 13 and Figure 18) and acquiring inspection data.

3.3. Results

In this section, we present the results of experiments performed in laboratory conditions. Figure 19 shows an infrared thermography image with one of the idlers exhibiting a significantly higher temperature. At the same time, RGB images were acquired by the second camera. Hot area detection was performed according to the algorithm presented in Figure 16.

4. The Inspection Data Processing—Methods and Results

In this section, we present the results of our laboratory experiment as well as the results of image processing and analysis. Our procedure consists of three steps: infrared image analysis, RGB image analysis, and fusion of these results to validate proper hot spot detection.

4.1. IR Data Processing for Hot Spot Detection

From the image presented in Figure 19, we extracted the part of the image containing a possible hot spot, see Figure 20. Using Matlab tools (Image Segmentation Using the Color Thresholder App), we segmented the hot spot as presented in Figure 21.

4.2. RGB Image Data Processing for Idler Detection

By analogy to the IR image, we selected an RGB image with the belt conveyor on it. It was acquired at the same time as the IR image, with the same robot localization. Using the same Matlab tool as before (Image Segmentation Using the Color Thresholder App), we segmented the image presented in Figure 22, with the result shown in Figure 23.

4.3. The Fusion of RGB Image and IR Data Processing for Hot Idler Detection

Along the belt conveyor route, one may detect many hot spots. To validate that a detected hot spot is in fact a hot idler, it is proposed to fuse the RGB image and the IR data. If the segmented areas are similar, one can conclude that the detected hot spot is indeed the idler. The idea of comparing the two types of images is presented in Figure 24. In Figure 25, we present the two segmented images. Based on shape and location, we can confirm that the detected hot spot is a damaged idler.
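One simple way to implement this comparison is to check how much of the IR hot-spot mask falls inside the RGB idler mask, as in the Python sketch below. The masks are assumed to be co-registered binary images, and the 0.5 overlap threshold is an assumed value, not one taken from the paper.
```python
import numpy as np

def confirm_hot_idler(ir_mask: np.ndarray, idler_mask: np.ndarray,
                      min_overlap: float = 0.5) -> bool:
    """Accept a hot spot as a hot idler if it mostly overlaps the segmented idler.

    ir_mask, idler_mask: boolean arrays of equal shape (assumed co-registered views).
    min_overlap: fraction of hot-spot pixels that must fall inside the idler mask
    (the 0.5 value is an assumption, not taken from the paper).
    """
    hot_pixels = np.count_nonzero(ir_mask)
    if hot_pixels == 0:
        return False
    overlap = np.count_nonzero(ir_mask & idler_mask) / hot_pixels
    return overlap >= min_overlap

# Toy example: a small hot spot lying fully inside a larger idler region
ir = np.zeros((100, 100), dtype=bool); ir[40:44, 40:44] = True
idler = np.zeros((100, 100), dtype=bool); idler[30:60, 30:60] = True
print(confirm_hot_idler(ir, idler))  # True
```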

5. Discussion

An experiment in the lab is usually an approximation of real-environment use cases. Indeed, we are aware of and expect that the same experiment in the mine will bring new challenges. However, we verified that the use of an infrared camera is feasible and that the quality of the IR picture is good enough to perform hot idler detection, see Figure 26. In this figure, one can identify three idlers with higher temperatures (above the belt).
The next step will be testing the prototype in the underground mine and collecting inspection data for various belt conveyors.

6. Conclusions

A UGV mobile robot was proposed as a tool for collecting inspection data for belt conveyor maintenance. The robot, driven by a radio-based remote controller operated by a human, was tested in laboratory conditions in the GEO3EM Research Center at the Wroclaw University of Science and Technology. An artificially damaged idler (bearing fault) was introduced, so we were able to use infrared thermography to identify hot spots along the conveyor route. The hot spot was detected using standard image segmentation techniques available in Matlab. The same approach was used for idler detection in the RGB image. To validate that the detected hot spot is the damaged idler, we fused the RGB image with the IR image. There is still a lot of work related to human-robot interaction and robot autonomy, and the locomotion of a robot in the mining environment remains difficult. Automation of image analysis is also a challenge. However, we believe that, at this stage, we can send an inspection robot along the conveyor route in teleoperation mode without human assistance on site. The mission could be limited to data acquisition; over distances of up to several hundred meters, we can transmit data from the robot to a local computer at the beginning of the conveyor. Considering that the maximum length of a conveyor is 1000 m, the inspection could be done in real time if two access points (at the beginning and end of the conveyor route) are available. Knowing that each drive unit has sensors connected to SCADA, this is practically feasible.

Author Contributions

Conceptualization, J.S. and R.Z.; Methodology, R.Z. and J.S.; Investigation, J.S., J.W. and R.Z.; Mobile robot design, sensory system and software, J.S.; Idler detection software, R.Z. and J.S.; Data acquisition software, J.S.; Writing original draft preparation, R.Z. and J.S.; Formal analysis, R.B.; Writing review and editing, J.W.; Funding acquisition, R.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This work is supported by EIT Raw Materials GmbH under the Framework Partnership Agreement (Autonomous Monitoring and Control System for Mining Plants-AMICOS).

Acknowledgments

This activity has received funding from the European Institute of Innovation and Technology (EIT), a body of the European Union, under the Horizon 2020, the EU Framework Programme for Research and Innovation.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Ten Technologies with the Power to Transform Mining. MINE Mag. 2014. Available online: https://www.mining-technology.com/features/featureten-technologies-with-the-power-to-transform-mining4211240/ (accessed on 14 May 2020).
  2. Liu, X.; Pang, Y.; Lodewijks, G.; He, D. Experimental research on condition monitoring of belt conveyor idlers. Measurement 2018, 127, 277–282. [Google Scholar] [CrossRef]
  3. Król, R.; Kisielewski, W. Research of loading carrying idlers used in belt conveyor—Practical applications. Diagnostyka 2014, 15, 67–73. [Google Scholar]
  4. Król, R.; Kisielewski, W. The influence of idlers on energy consumption of belt conveyor. Min. Sci. 2014, 21, 61–72. [Google Scholar]
  5. Cunha, F.; Youcef-Toumi, K. Ultra-Wideband Radar for Robust Inspection Drone in Underground Coal Mines. In Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA), Brisbane, Australia, 21–25 May 2018; pp. 86–92. [Google Scholar]
  6. Cao, X.; Zhang, X.; Zhou, Z.; Fei, J.; Zhang, G.; Jiang, W. Research on the Monitoring System of Belt Conveyor Based on Suspension Inspection Robot. In Proceedings of the 2018 IEEE International Conference on Real-time Computing and Robotics (RCAR), Kandima, Maldives, 1–5 August 2018; pp. 657–661. [Google Scholar]
  7. Ge, F.; Moore, W.; Antolovich, M.; Gao, J. Robot learning by a mining tunnel inspection robot. In Proceedings of the 2012 9th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI), Daejeon, South Korea, 26–28 November 2012; pp. 200–204. [Google Scholar]
  8. Roh, S.G.; Ryew, S.M.; Yang, J.H.; Choi, H.R. Actively steerable in-pipe inspection robots for underground urban gas pipelines. In Proceedings of the IEEE International Conference on Robotics and Automation (Cat. No.01CH37164), Seoul, Korea, 21–26 May 2001; Volume 1, pp. 761–766. [Google Scholar]
  9. Bing, J.; Sample, A.P.; Wistort, R.M.; Mamishev, A.V. Autonomous robotic monitoring of underground cable systems. In Proceedings of the 12th International Conference on Advanced Robotics, Seattle, WA, USA, 17–20 July 2005; pp. 673–679. [Google Scholar]
  10. Zimroz, R.; Hutter, M.; Mistry, M.; Stefaniak, P.; Walas, K.; Wodecki, J. Why Should Inspection Robots be used in Deep Underground Mines? In Proceedings of the 27th International Symposium on Mine Planning and Equipment Selection—MPES 2018; Widzyk-Capehart, E., Hekmat, A., Singhal, R., Eds.; Springer International Publishing: Cham, Switzerland, 2019; pp. 497–507. [Google Scholar]
  11. Li, H.; Savkin, A.V.; Vucetic, B. Autonomous Area Exploration and Mapping in Underground Mine Environments by Unmanned Aerial Vehicles. Robotica 2020, 38, 442–456. [Google Scholar] [CrossRef]
  12. Miller, I.D.; Cladera, F.; Cowley, A.; Shivakumar, S.S.; Lee, E.S.; Jarin-Lipschitz, L.; Bhat, A.; Rodrigues, N.; Zhou, A.; Cohen, A.; et al. Mine Tunnel Exploration Using Multiple Quadrupedal Robots. IEEE Robot. Autom. Lett. 2020, 5, 2840–2847. [Google Scholar] [CrossRef] [Green Version]
  13. Papachristos, C.; Khattak, S.; Mascarich, F.; Alexis, K. Autonomous Navigation and Mapping in Underground Mines Using Aerial Robots. In Proceedings of the 2019 IEEE Aerospace Conference, Big Sky, MT, USA, 2–9 March 2019; pp. 1–8. [Google Scholar]
  14. Thrun, S.; Thayer, S.; Whittaker, W.; Baker, C.; Burgard, W.; Ferguson, D.; Hahnel, D.; Montemerlo, D.; Morris, A.; Omohundro, Z.; et al. Autonomous exploration and mapping of abandoned mines. IEEE Robot. Autom. Mag. 2004, 11, 79–91. [Google Scholar] [CrossRef] [Green Version]
  15. Grehl, S.; Sastuba, M.; Donner, M.; Ferber, M.; Schreiter, F.; Mischo, H.; Jung, B. Towards virtualization of underground mines using mobile robots—From 3D scans to virtual mines. In Proceedings of the 23rd International Symposium on Mine Planning & Equipment Selection, Johannesburg, South Africa, 9–11 November 2015. [Google Scholar]
  16. Maity, A.; Majumder, S.; Ray, D.N. Amphibian subterranean robot for mine exploration. In Proceedings of the 2013 International Conference on Robotics, Biomimetics, Intelligent Computational Systems, Jogjakarta, Indonesia, 25–27 November 2013; pp. 242–246. [Google Scholar]
  17. Murphy, R.R.; Kravitz, J.; Stover, S.L.; Shoureshi, R. Mobile robots in mine rescue and recovery. IEEE Robot. Autom. Mag. 2009, 16, 91–103. [Google Scholar] [CrossRef]
  18. Zhu, J.; Gao, J.; Li, K.; Lin, W.; Bi, S. Embedded control system design for coal mine detect and rescue robot. In Proceedings of the 2010 3rd International Conference on Computer Science and Information Technology, Chengdu, China, 9–11 July 2010; Volume 6, pp. 64–68. [Google Scholar]
  19. Green, J. Mine rescue robots requirements Outcomes from an industry workshop. In Proceedings of the 2013 6th Robotics and Mechatronics Conference (RobMech), KwaZulu-Natal, South Africa, 30–31 October 2013; pp. 111–116. [Google Scholar]
  20. Liu, G.; Zhu, L.; Han, Z.; Zhao, J. Distribution and communication of multi-robot system for detection in the underground mine disasters. In Proceedings of the 2009 IEEE International Conference on Robotics and Biomimetics (ROBIO), Guilin, China, 18–22 December 2009; pp. 1439–1444. [Google Scholar]
  21. Szrek, J.; Arent, K. Measurement system for ground reaction forces in skid-steering mobile platform rex. In Proceedings of the 2015 20th International Conference on Methods and Models in Automation and Robotics (MMAR), Międzyzdroje, Poland, 24–27 August 2015; pp. 756–760. [Google Scholar]
  22. Szrek, J.; Wojtowicz, P. Idea of wheel-legged robot and its control system design. Bull. Pol. Acad. Sci. Technol. Sci. 2010, 58, 43–50. [Google Scholar] [CrossRef]
  23. DWM1001 Module Documentation. Available online: https://www.decawave.com/product/dwm1001-module/ (accessed on 14 May 2020).
  24. OpenCV Documentation: Changing Colorspaces. Available online: https://docs.opencv.org/trunk/df/d9d/tutorial_py_colorspaces.html (accessed on 13 May 2020).
Figure 1. A belt conveyor drive: a temperature sensor location (a); an electric motor, a coupling and a gearbox (b).
Figure 2. The conveyor system in the KGHM OZG Polkowice-Sieroszowice mine. Each elementary segment is a separate conveyor; the scale of the system makes frequent inspection difficult. Each arrow represents a conveyor, each circle a bunker for ore storage.
Figure 3. One Control Room in the KGHM OZG Polkowice-Sieroszowice mine (source: http://polskamiedz.wp.pl/artykul/kopalnie-kghm-jak-w-filmowym-avatarze-gornik-pod-ziemia-najmniej-potrzebuj-dzis-kilofu).
Figure 4. Intervention of the inspection team.
Figure 5. Conveyor belt route inspection as currently performed.
Figure 6. A scheme of the chassis of the robot for inspection in an underground mine.
Figure 7. A robot for belt conveyor inspection.
Figure 8. A concept of the control system.
Figure 9. Hardware of the control system.
Figure 10. Mobile platform remote control panel.
Figure 11. Sensory and location module.
Figure 12. Localization of the tag.
Figure 13. A plan of the experiment in the lab.
Figure 14. UWB anchor.
Figure 15. Algorithm for reading and recording sensory data.
Figure 16. An algorithm of hot area detection.
Figure 17. Belt conveyor test rig in the GEO3EM Research Center.
Figure 18. Robot in action: a view on a laptop screen.
Figure 19. An infrared thermography image with one of the idlers showing a significantly higher temperature.
Figure 20. Localized neighbourhood of a hotspot.
Figure 21. Segmented hotspot.
Figure 22. A visual reference for hotspot localization.
Figure 23. Segmented idlers.
Figure 24. An infrared thermography image and RGB image fusion for one of the idlers with a significantly higher temperature.
Figure 25. Result of infrared thermography image and RGB image fusion for one of the idlers with a significantly higher temperature.
Figure 26. Belt conveyor as an IR picture.

Citation

Szrek, J.; Wodecki, J.; Błażej, R.; Zimroz, R. An Inspection Robot for Belt Conveyor Maintenance in Underground Mine—Infrared Thermography for Overheated Idlers Detection. Appl. Sci. 2020, 10, 4984. https://doi.org/10.3390/app10144984
