1. Introduction
Public safety has always been an area of great importance for every country, and fire-fighting plays a significant role in it. In recent years, fire extinguishing equipment of various types has been developed towards intelligence and automation. As key fire-fighting equipment, unmanned fire-fighting robots have attracted increasingly extensive attention [1,2], and various types of fire-fighting robots have been designed and manufactured [3,4,5,6]. Among them, conventional platforms such as crawler fire-fighting robots have been widely adopted by fire brigades.
Table 1 lists the functional characteristics of several current mainstream fire-fighting robots worldwide [7,8,9,10]. It is not difficult to find that the main role of current fire-fighting robots is to replace firefighters in special environments, such as high-temperature, smoke-filled, or narrow areas, to carry out fire-fighting work. These robots have made a significant contribution to reducing the work intensity and casualties of firefighters, and the vision sensors mounted on them provide essential decision-making information for firefighters' operations. However, the final decision still depends on the firefighter's judgement of the fire state, and the information returned by the vision sensors can only be viewed by firefighters. Under such conditions, the fire-fighting result mainly depends on the firefighter's execution ability and work experience. In fact, fire-field conditions change constantly; for example, the combustion state [11], water supply pressure [12], and air heat convection all fluctuate, requiring the firefighter to continuously adjust the robot's operation. On the one hand, subtle position changes of the robot are difficult for firefighters to notice accurately during operation, which may directly cause an inaccurate water jet trajectory. On the other hand, uninterrupted manual operation reduces fire extinguishing efficiency and increases water consumption.
In the early 21st century, the Tokyo fire department developed and applied a two-nozzle fire-fighting robot discharging water and foam at flow rates of 5000 and 3000 L/min, respectively [13]. Fang [14] designed a fire-fighting robot for railway tunnel fires capable of fire detection, alarm, and extinguishment at the early stage. Kuo et al. [15] designed a fire-fighting robot with three flame sensors for smart building fire protection. Similarly, Khoo [16] developed a fire extinguishing robot mainly used indoors, such as in residences, offices, and high-rise buildings, which is able to sense a flame and move to its location when a fire occurs in the house.
Furthermore, Fan et al. [17] proposed a method combining Gmapping SLAM with a fire source recognition image processing algorithm, realizing autonomous navigation and fire source recognition and detection for fire-fighting robots. Researchers at Virginia Tech have carried out a series of studies on fire-fighting robots: a fusion system of infrared stereo vision and LiDAR was used for fire recognition under different visibility conditions [18,19], and infrared stereo vision was also used for water jet trajectory identification and positioning. Subsequently, automatic control of the fire monitor's yaw and pitch angles was realized; the key technologies include a jet trajectory model based on experimental data and a fire location algorithm for the fusion system.
A brief summary of the functions and research directions of fire robot vision sensors is given here. The first direction is fire recognition and positioning: benefiting from the continuous advancement of computer vision technology, monocular vision [20], binocular vision, infrared recognition, LiDAR, and unmanned aerial vehicles are widely used for fire recognition and location in various environments [21]. The second is precise and continuous fire suppression: related research mainly focuses on the establishment and correction of jet trajectory models [22,23]. However, the position parameters of the jet trajectory are subject to random disturbances, such as the robot's roll, pressure fluctuations in the water supply line, and random wind. Since the fire-field environment changes constantly, robot vision systems are developing towards multiple sensors, sensor fusion, and task-specific sub-systems so that intelligent fire-fighting robots can adapt to various working environments. Continuous and accurate delivery of the water jet to the fire position is also gaining increasing attention.
The main contribution of this paper is an improved NFCV method for jet trajectory identification and parameterization. Firstly, an improved mixed Gaussian background subtraction method is applied to jet trajectory identification, based on an analysis of the fire robot's working environment. Subsequently, jet trajectory geometric features, including length and area ratios, are proposed in order to eliminate false detection results. Furthermore, a superimposed radial centroid method is developed for jet trajectory parameterization and feature extraction for falling position prediction. Finally, the jet trajectory recognition and falling position prediction results are compared with those obtained by the previous method.
The remainder of this paper is arranged as follows:
Section 2 briefly reviews our previous work on the NFCV system and analyzes several issues discovered during application testing.
Section 3 describes the improvement of the NFCV system, including parameter adjustment of the mixed Gaussian background method, jet trajectory discrimination, parameterization, and feature extraction.
Section 4 introduces the experiment and analyzes the experimental results. Finally, some conclusions are given.
2. Review of Previous Work
2.1. Near-Field Computer Vision
Regarding the prediction of the jet trajectory falling position of a fire monitor, the Near-Field Computer Vision (NFCV) method was proposed in our previous work [24]. It is mainly based on the following consideration: it is difficult to capture a complete image of a long jet trajectory, while capturing its initial segment is relatively easy, and the features contained in the initial jet trajectory can still be used to predict the falling position. The NFCV method, including its hardware and software systems, is shown in Figure 1.
Figure 1a,b present the structural diagram of the NFCV system and the fire robot equipped with the system, respectively. Note that the infrared camera is used by the intelligent fire monitor to detect the fire target and adjust the yaw angle range; it is therefore beyond the scope of this article.
During jetting, the near-field camera captures the jet trajectory image and sends it back to the computer, which runs an image preprocessing, trajectory parameterization, and falling position prediction program developed with C++ and OpenCV, as shown in Figure 1b. The core functions include the following:
(1) Image preprocessing mainly includes perspective transformation, image enhancement, and trajectory segmentation. The near-field camera installed on the left side of the fire monitor captures the trajectory image from an oblique perspective, so a perspective transformation is used to restore the image to the front view. Image enhancement is used to eliminate noise and highlight the jet trajectory in the image. The jet trajectory commonly appears white or bright because water tends to reflect light; in the enhanced image, the brightness difference between the jet trajectory and the background is utilized to segment the jet trajectory.
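The perspective transformation maps each oblique-view pixel to the front view through a 3×3 homography. As a minimal sketch (in pure Python rather than the paper's C++/OpenCV program, and with the identity matrix standing in for the system's calibrated homography, which is not given here), a single point can be warped as follows:

```python
# Sketch of the perspective (homography) warp used to restore the front view.
# In practice the 3x3 matrix comes from calibrated point correspondences;
# the identity matrix below is only a sanity-check placeholder.

def warp_point(H, x, y):
    """Map a pixel (x, y) through homography H (row-major 3x3 list of lists)."""
    xh = H[0][0] * x + H[0][1] * y + H[0][2]
    yh = H[1][0] * x + H[1][1] * y + H[1][2]
    w  = H[2][0] * x + H[2][1] * y + H[2][2]
    return xh / w, yh / w          # divide by w: projective -> Cartesian

# The identity homography leaves points unchanged (sanity check).
H_id = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
print(warp_point(H_id, 10, 20))  # -> (10.0, 20.0)
```

Applying this mapping to every pixel (or, equivalently, warping the whole image with the inverse map) yields the front-view trajectory image used by the later steps.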
(2) Trajectory parameterization mainly includes trajectory center position calculation and feature extraction. In the segmented image, trajectory parameterization is the prerequisite for feature extraction. However, the jet trajectory extracted from an image may still miss position parameters due to residual noise. The mean position method was therefore proposed to extract the jet trajectory position in the image and draw the trajectory curve. Suppose that the size of the image is $M \times N$, and $f(i,j)$ represents the pixel value at $(i,j)$ in the image; the jet trajectory ordinate in each column $j$ of the binary image is expressed as:

$$\bar{y}_j = \frac{1}{a} \sum_{i:\, f(i,j)=255} i$$

where $a$ is the number of pixels in column $j$ that satisfy $f(i,j)=255$, and $\bar{y}_j$ is the jet trajectory ordinate in that column. Based on the acquired trajectory position data, the least squares method was used for parametric fitting of the trajectory. Furthermore, the jet trajectory features, including the start-point slope (SPS) and end-point slope (EPS), were extracted, which are expressed as:

$$\mathrm{SPS} = \left.\frac{dy}{dx}\right|_{x=x_s}, \qquad \mathrm{EPS} = \left.\frac{dy}{dx}\right|_{x=x_e}$$

where $y(x)$ is the fitted trajectory equation based on the jet trajectory position parameters in the image, and $x_s$ and $x_e$ are the abscissae of the starting and ending points of the jet trajectory curve, respectively.
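The mean position method and the SPS/EPS features can be sketched in a few lines of Python (the paper's program is C++/OpenCV). The quadratic model below is an assumption standing in for the paper's unspecified least-squares trajectory fit, and the toy binary image is illustrative only:

```python
# Sketch of the mean position method and the SPS/EPS slope features.
# `img` is a binary image as a list of rows; 255 marks jet trajectory pixels.
# The quadratic model is an assumed form of the least-squares trajectory fit.

def mean_position(img):
    """Per column j, average the row indices i with f(i, j) = 255."""
    rows, cols = len(img), len(img[0])
    pts = []
    for j in range(cols):
        ys = [i for i in range(rows) if img[i][j] == 255]
        if ys:                       # a = len(ys) pixels satisfy f(i, j) = 255
            pts.append((j, sum(ys) / len(ys)))
    return pts

def fit_quadratic(pts):
    """Least-squares fit y = a*x^2 + b*x + c via the 3x3 normal equations."""
    sx = [sum(x ** k for x, _ in pts) for k in range(5)]
    sy = [sum(y * x ** k for x, y in pts) for k in range(3)]
    A = [[sx[4], sx[3], sx[2]], [sx[3], sx[2], sx[1]], [sx[2], sx[1], sx[0]]]
    b = [sy[2], sy[1], sy[0]]
    for k in range(3):               # Gaussian elimination, partial pivoting
        p = max(range(k, 3), key=lambda r: abs(A[r][k]))
        A[k], A[p], b[k], b[p] = A[p], A[k], b[p], b[k]
        for r in range(k + 1, 3):
            f = A[r][k] / A[k][k]
            A[r] = [u - f * v for u, v in zip(A[r], A[k])]
            b[r] -= f * b[k]
    coef = [0.0, 0.0, 0.0]
    for k in (2, 1, 0):              # back substitution
        coef[k] = (b[k] - sum(A[k][c] * coef[c] for c in range(k + 1, 3))) / A[k][k]
    return coef                      # [a, b, c]

def slope(coef, x):
    """SPS = slope(coef, x_s); EPS = slope(coef, x_e)."""
    return 2 * coef[0] * x + coef[1]
```

Given a segmented mask, `fit_quadratic(mean_position(mask))` yields the curve whose derivative at the start and end abscissae gives SPS and EPS.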
2.2. NFCV System Defects
In previous work, the proposed NFCV system achieved good verification results in a static environment. However, many issues were discovered when the system was mounted on the fire-fighting robot during application testing, mainly manifested in the following:
Background interference: as the fire robot moves during fire-fighting, the background of the trajectory image captured by the near-field camera changes dramatically. In particular, bright areas in the background, such as a bright sky or standing water on the ground, may cause errors in trajectory segmentation.
Trajectory discrimination: other factors also cause deviations in jet trajectory detection. On the one hand, illumination changes in an outdoor environment cause sharp fluctuations in the brightness of the jet trajectory in the near-field image. On the other hand, the changing state of the jet trajectory may lead to inconsistent brightness in the image during jetting.
Figure 2 shows segmented jet trajectory images. Figure 2a–c are wrongly detected jet trajectory results, and Figure 2d is the ideal segmentation result under the same camera parameters. It is not difficult to find that the false detections mainly take two forms: the background is mistaken for the jet trajectory, as in Figure 2a, or part of the jet trajectory is removed as noise, as in Figure 2b,c.
Feature extraction: generally, jet trajectory feature extraction is affected by the fire monitor head and the shape of the trajectory. The color image of the jet trajectory captured by the near-field camera is shown in Figure 3a. It is not difficult to find that the fire monitor head appears in the image and was mistakenly detected as part of the jet trajectory, as shown in Figure 3b. Figure 3c shows the theoretical incident angle of the jet trajectory, indicated by the red arrow. In fact, in previous works, the leftmost side of the trajectory image was regarded as the starting position for calculating the incident angle, as shown by the red arrow in Figure 3d. Obviously, the mistakenly detected fire monitor head and the use of the image's leftmost position as the incident angle position may lead to inaccurate calculation results.
In general, when the proposed near-field vision system is mounted on a fire-fighting robot for functional testing, the more complex lighting environment, background state changes, and defective feature extraction methods may lead to inaccurate prediction of the jet trajectory falling position. This indicates that the NFCV system as applied on fire robots cannot meet the accuracy requirements of field testing and application. Therefore, improvements around trajectory misdetection, trajectory result discrimination, and trajectory feature extraction are the main focus of this paper.
4. Experiments and Discussion
The improved near-field vision system was mounted on a fire robot and its performance was tested in a factory environment. Firstly, the experimental system is briefly explained, including the hardware structure and the improved NFCV system workflow. Then, experimental data and analysis results are given.
4.1. Experiment Setup
The hardware system of the experiment mainly consists of the NFCV system and the fire robot. The experimental platform includes the near-field camera, bracket, fire monitor, fire robot, and other auxiliary equipment, as shown in Figure 7. The near-field camera is connected to a personal computer running a self-designed image processing program and is mounted 25 cm to the left of the fire monitor through a bracket. The selected fire monitor is model PS20–50, a common product on the market. The independently developed computer program can adjust the yaw and pitch angles of the fire monitor through a development board. It must be pointed out that the NFCV system follows the fire monitor through horizontal rotation; in other words, the near-field camera is only used to capture changes in the jet trajectory state caused by pitch angle adjustment of the fire monitor.
The complete work process is shown in Figure 8 in order to further illustrate the workflow of the proposed near-field vision system and our latest progress.
Image preprocessing: image preprocessing is the first step of almost all vision systems, and here includes perspective transformation, image morphology operations, and image enhancement operations. In particular, perspective transformation is used to restore the front view of the jet trajectory. Other operations include camera distortion correction, perspective transformation matrix calibration, etc.
Jet trajectory segmentation: a segmentation method adapted to the application testing of the NFCV system is one of the focuses of this paper. Parameter optimization of the background subtraction method based on the mixed Gaussian model, mainly involving the variance threshold and background update rate, was carried out to adapt it to the jet trajectory collection environment; a detailed description can be found in Section 3.1.1.
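The role of the two tuned parameters can be illustrated with a stripped-down background model. A real mixed Gaussian subtractor maintains several Gaussians per pixel; the single-Gaussian, single-pixel sketch below (with hypothetical parameter values, not the ones chosen in Section 3.1.1) only shows how the variance threshold gates foreground decisions and how a slower update rate keeps the background model stable:

```python
# Illustrative single-Gaussian background model for one pixel, showing the
# two parameters tuned for jet trajectory segmentation: the variance
# threshold and the background update (learning) rate. A full mixed
# Gaussian subtractor keeps a bank of such Gaussians per pixel.

class PixelBackground:
    def __init__(self, mean, var=25.0, var_threshold=16.0, lr=0.005):
        self.mean, self.var = float(mean), var
        self.var_threshold = var_threshold  # higher -> fewer false foreground hits
        self.lr = lr                        # slower rate -> more stable background

    def apply(self, value):
        """Return True if `value` is foreground, then update the model."""
        d2 = (value - self.mean) ** 2
        foreground = d2 > self.var_threshold * self.var
        # Exponential update of the background statistics toward the observation.
        self.mean += self.lr * (value - self.mean)
        self.var += self.lr * (d2 - self.var)
        return foreground

bg = PixelBackground(mean=100.0)
print(bg.apply(102.0))  # small deviation -> background (False)
print(bg.apply(250.0))  # bright jet pixel -> foreground (True)
```

Raising `var_threshold` makes the gate harder to trip (suppressing illumination flicker), while lowering `lr` keeps bright transient regions from being absorbed into the background model too quickly.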
Trajectory discrimination: unsatisfactory jet trajectory detection results may still occur due to other interference. Therefore, two discriminant parameters, the length and area proportions, are proposed to select ideal jet trajectory segmentation results; detailed information can be found in Section 3.1.2.
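A plausible sketch of such a two-parameter check on a segmented binary mask follows: the length proportion (span of occupied columns versus image width) rejects trajectories partially removed as noise, and the area proportion (white pixels versus image area) rejects backgrounds mistaken for the trajectory. The threshold values are hypothetical placeholders, not those derived in Section 3.1.2:

```python
# Sketch of trajectory discrimination by length and area proportions.
# `mask` is a binary image as a list of rows (non-zero = trajectory pixel).
# Thresholds are illustrative placeholders, not the paper's tuned values.

def trajectory_ok(mask, min_len_ratio=0.5, max_area_ratio=0.3):
    rows, cols = len(mask), len(mask[0])
    occupied = [j for j in range(cols) if any(mask[i][j] for i in range(rows))]
    if not occupied:
        return False                 # nothing segmented at all
    length_ratio = (occupied[-1] - occupied[0] + 1) / cols
    area_ratio = sum(v != 0 for row in mask for v in row) / (rows * cols)
    # Too short a span -> trajectory partially removed as noise;
    # too large an area -> background mistaken for trajectory.
    return length_ratio >= min_len_ratio and area_ratio <= max_area_ratio
```

Frames failing either test are discarded, and the previous valid prediction is retained instead.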
Trajectory parameterization: trajectory parameterization is an important guarantee for feature extraction, and the existing issues are analyzed in Section 2.2. The radial centroid method, developed on top of the mean position method, is used for trajectory parameterization and secondary jet trajectory drawing, which is also one of our core contributions; a detailed description can be found in Section 3.2.
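One plausible reading of the radial centroid refinement (the exact formulation is given in Section 3.2, and the geometry here is an illustrative assumption) is that, starting from the curve produced by the mean position method, the white-pixel centroid is recomputed along the normal ("radial") direction at each sample point, giving the superimposed, secondary parameterization:

```python
# Hypothetical sketch of one radial-centroid step: sample white pixels of a
# binary mask along the unit normal (nx, ny) at a point (x, y) on the
# initial fitted curve, and return their centroid as the refined position.

def radial_centroid(mask, x, y, nx, ny, half_len=5):
    """Centroid of white pixels within +/-half_len samples along the normal."""
    pts = []
    for t in range(-half_len, half_len + 1):
        i = round(y + t * ny)
        j = round(x + t * nx)
        if 0 <= i < len(mask) and 0 <= j < len(mask[0]) and mask[i][j]:
            pts.append((j, i))
    if not pts:
        return x, y  # keep the original sample if no trajectory pixels found
    return (sum(p[0] for p in pts) / len(pts),
            sum(p[1] for p in pts) / len(pts))
```

Refitting the trajectory equation through these refined centroids would then give the secondary drawing used for feature extraction.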
Prediction model: based on trajectory features and experimental data, the least squares method is used to establish a multiple regression model for falling position prediction of the jet trajectory. As the least squares method is standard in numerical regression, its derivation is not repeated in this paper.
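The regression step can be sketched as an ordinary least-squares linear model mapping trajectory features such as SPS and EPS to the landing distance. The feature rows and distances below are synthetic placeholders generated from an assumed model, not the paper's experimental data, and the true regressors and model order may differ:

```python
# Hedged sketch of the falling-position regression: solve the normal
# equations (X^T X) b = X^T y for a small feature count. The training
# pairs are synthetic placeholders, not experimental measurements.

def fit_linear(X, y):
    """Least-squares coefficients for rows X and targets y (small systems)."""
    n = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(n)] for i in range(n)]
    c = [sum(r[i] * t for r, t in zip(X, y)) for i in range(n)]
    for k in range(n):               # Gauss-Jordan elimination
        piv = A[k][k]
        A[k] = [v / piv for v in A[k]]
        c[k] /= piv
        for r in range(n):
            if r != k:
                f = A[r][k]
                A[r] = [a - f * b for a, b in zip(A[r], A[k])]
                c[r] -= f * c[k]
    return c

# Rows: [1 (intercept), SPS, EPS]; targets: landing distance in metres
# (synthetic values lying exactly on the plane d = 10 + 5*SPS + 1*EPS).
X = [[1, 0.6, -0.4], [1, 0.7, -0.7], [1, 0.8, -0.5], [1, 0.65, -0.62]]
y = [12.6, 12.8, 13.5, 12.63]
b = fit_linear(X, y)
predict = lambda sps, eps: b[0] + b[1] * sps + b[2] * eps
```

With calibrated experimental pairs in place of the synthetic ones, `predict` would return the estimated falling distance for a newly extracted feature pair.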
Figure 9 shows the processed image results after each step of the improved NFCV system, and Table 2 lists the corresponding average processing time for each step. The improved background subtraction method and the developed trajectory discrimination parameters are effective for jet trajectory segmentation in the original image, and a satisfactory fit of the jet trajectory equation was achieved by the trajectory parameterization process with the radial centroid method.
4.2. Experiment Results and Analysis
The improved NFCV method includes mixed Gaussian background subtraction, jet trajectory discrimination, and trajectory parameterization. In our work, a comparative analysis of the falling point prediction results corresponding to each improvement was carried out, and the predicted results were compared with previous results to clearly show the improvement. In the experimental system, the water supply pressure of the fire robot was stably provided by a centrifugal pump. The water outlet pressure at the fire monitor head was 0.115 MPa, since most of the pressure energy had been converted into kinetic energy in the internal pipeline of the fire robot. In general, the commonly used pitch angle of the fire monitor during fire extinguishing is 20–40°. The abovementioned methods were applied incrementally for falling position prediction with 104 captured jet trajectory images at a fire monitor pitch angle of 35°, and the results are shown in Figure 10.
Obviously, the enhancement from each improvement is significant. Firstly, compared with the previous jet trajectory segmentation method, the parameter-optimized mixed Gaussian background subtraction method significantly reduced the prediction error from 1.36 m to 0.69 m; the comparison curve is shown in Figure 10a. Clearly, the mixed Gaussian background subtraction method is more suitable for jet trajectory segmentation of the NFCV system in the current testing environment; in particular, the higher variance threshold and slower background update rate improve the accuracy of the predicted results significantly.
Figure 10b shows the comparison between the results obtained through the mixed Gaussian background subtraction method alone and after adding trajectory discrimination. It can be seen that the length and area proportion parameters effectively filtered out the relatively large incorrect prediction values. In the magnified box indicated by the black arrow, it is easy to see that segmented jet trajectories that did not satisfy the judgment parameters were eliminated by the developed method; the values shown in the black frame are predicted results that were not updated. Among the 104 captured frames, approximately 21 were eliminated, accounting for 20.1%.
Figure 10c shows the comparison between the results obtained by trajectory discrimination and those with the mean position method added. The mean position method, which was also used in our previous work, provides the preliminary trajectory parameterization, as introduced in Section 2.1. The mean position method slightly improves the prediction accuracy, reducing the average error from 0.49 m to 0.34 m. The radial centroid parameterization built on the mean position method was therefore necessary; the comparison curve of its predicted results is shown in Figure 10d. It can be clearly seen that the trajectory secondarily parameterized by the radial centroid method realizes accurate falling position prediction, with an average error of 0.10 m. Overall, the predicted positions, represented by the red dots in Figure 10a, gradually approach the black dotted line, the experimentally calibrated position of 13.8 m, as the improved methods are added sequentially to the NFCV system.
Figure 11 shows the mean and variance of the prediction results as each improved method is added sequentially. The declining values indicate that the improvements are effective in increasing the reliability of the NFCV system, especially the mixed Gaussian background subtraction method after parameter adjustment and the trajectory parameterization based on the radial centroid method. Furthermore, the successively decreasing variance indicates that the stability of the NFCV system has also been improved, which plays a positive role in realizing the application feasibility of the system on intelligent fire robots.
Figure 12 shows the jet trajectory falling position prediction error under different pitch angles and fluid supply pressures based on the improved NFCV system. It is not difficult to find that the average falling position error increases with the supply pressure, with a maximum error of 0.09 m at a fire monitor outlet pressure of 0.115 MPa. Similarly, the pitch angle of the fire monitor is negatively related to the average error, and the maximum error occurs at around 35°. It should be pointed out that this error has little effect on fire extinguishing efficiency, because the extinguishing agent spreads out into a large elliptical area at the terminal of the jet trajectory.
Table 3 provides detailed results, mainly including the classification of prediction errors and a comparison with previous methods. The errors of the proposed improved methods were all less than 0.5 m, and more than half were less than 0.1 m. In other words, the improved NFCV system basically satisfies the performance requirements of intelligent fire-fighting robots, even for smaller fire spots. Compared with previous works, the improved method greatly improves the functionality and reliability of the NFCV system. Meanwhile, the improved algorithm does not increase the image processing and computation time, and fully supports the near-field camera's sampling frequency of no less than 1 Hz.
5. Conclusions
An improved method for the NFCV system was proposed in this paper, including the mixed Gaussian background method after parameter adjustment, jet trajectory discrimination, parameterization, and feature extraction of the jet trajectory. The experimental results suggest that, considering the complex lighting environment in the application of the NFCV system, a higher variance threshold and slower background update rate play a significant positive role in reducing false detection of the jet trajectory. Furthermore, through analysis of the jet trajectory's shape characteristics, the proposed discrimination method based on the length and area ratio parameters eliminated some stubborn false detection results. Finally, the radial centroid parameterization superimposed on the mean position method further increased the accuracy of the prediction results. A prediction error of no more than 0.5 m, with more than 60% of errors less than 0.1 m, indicates that the improved NFCV system basically satisfies the performance requirements of intelligent fire-fighting robots, even for smaller fire spots.
Our future work mainly includes two aspects. On the one hand, it is necessary to further improve the prediction accuracy of the NFCV system in various practical environments; after all, many interference factors have not yet been considered. On the other hand, the application of the NFCV system on intelligent fire robots should be further expanded, for example, to aiming at and tracking fire in combination with infrared vision, and to coping with changes in the jet trajectory falling position caused by supply pressure fluctuations of the fire robot.