Article

A Deep-Learning-Based Approach to the Classification of Fire Types

by Eshrag Ali Refaee *,†, Abdullah Sheneamer † and Basem Assiri †

Department of Computer Sciences, Jazan University, Jazan 45142, Saudi Arabia
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
Appl. Sci. 2024, 14(17), 7862; https://doi.org/10.3390/app14177862
Submission received: 29 July 2024 / Revised: 24 August 2024 / Accepted: 27 August 2024 / Published: 4 September 2024

Abstract
The automatic detection of fires and the determination of their causes play a crucial role in mitigating the catastrophic consequences of such events. The literature reveals substantial research on automatic fire detection using machine learning models. However, once a fire is detected, there is a notable gap in the literature concerning the automatic classification of fire types such as solid-material fires, flammable-gas fires, and electric-based fires. This classification is essential for firefighters to quickly and effectively determine the most appropriate fire-suppression method. This work introduces a publicly released benchmark dataset comprising 1353 manually annotated images classified into five categories, forming a multiclass dataset based on the types of origins of fires. It also presents a system incorporating eight deep-learning models evaluated for fire detection and fire-type classification. In fire-type classification, this work focuses on four fire types: solid-material, chemical, electrical-based, and oil-based fires. Under the single-level, five-way classification setting, our system achieves its best performance with an accuracy score of 94.48%. Under the two-level classification setting, it achieves its best performance with accuracy scores of 98.16% for fire detection and 97.55% for fire-type classification, using the DenseNet121 and EfficientNet-b0 models, respectively. The results also indicate that electrical and oil-based fires are the most challenging to detect.

1. Introduction

The recent development of artificial-intelligence-based technologies has advanced many fields, such as safety and security [1,2,3,4,5]. One of the main fields of safety and security is firefighting. Fires are hazardous, as they spread fast and produce massive heat and toxic smoke within seconds. The heat can range from 100 to 1000 degrees, while the smoke blocks vision. Despite progress in constructing fire-resistant buildings and improving fire-prevention methods, our society still needs to address fires. In 2018, the National Fire Protection Association reported that a fire department in the United States responded to a fire every 24 seconds. In 2022, there were more than 1.5 million fires in the United States, marking a 12.2% increase from 2013 [6]. The number of fire-related deaths in 2022 was 3790, an 18.6% increase from 2013. The financial impact of fires has also risen, with a total estimated loss of 18.1 billion USD in 2022, a 28.9% increase from 2013 [6]. As such, the aggregate cost of fires, in lives and property lost, can be devastating. Researchers can help advance this domain via empirical investigations of various methodologies utilizing real-life datasets.
Efforts in firefighting are not new. However, AI algorithms can help significantly reduce the devastating consequences of fires. Researchers have so far focused mainly on the automatic detection of fires. To better understand the situation, it is essential to know how a fire works in the first place. Three elements must exist to start a fire: oxygen, fuel, and heat; removing any one of them stops the fire.
Focusing on fire causes, researchers have found that the ignition of fires can stem from various factors and sources [7], including the following:
  • Thermal influence: the emanation of heat from solar radiation, electrical sources, or vehicular collisions stands out as a principal instigator of fire occurrences.
  • Combustion: among the conspicuous instigators of fire ignition are open flames originating from matches, lighters, candles, or smouldering remnants capable of initiating combustion in flammable substances.
  • Electrical dysfunctions: anomalies in electrical systems, encompassing compromised wiring, circuit overloads, or the impairment of electrical apparatus, tend to generate sparks or excessive heat, thereby precipitating fires.
  • Culinary mishaps: instances characterized by unattended cooking, oil overheating, the accumulation of grease, or the proximity of inflammable materials to cooking surfaces can readily incite conflagrations.
  • Smoking incidents: improperly disposing of cigarette remnants or ashes can enkindle combustible materials, particularly within environments harbouring flammable compounds or exhibiting arid conditions.
  • Chemical reactions: the juxtaposition or exposure of certain chemicals to elevated temperatures can instigate reactions that yield sparks or flames, thus engendering fire hazards.
  • Arson: the deliberate act of igniting fires, denoted as arson, may arise from criminal motivations or malicious intentions.
  • Frictional and mechanical stimuli: the frictional interaction between surfaces or the generation of sparks due to mechanical implements can act as catalysts for igniting flammable substances, notably within industrial settings.
  • Flammable liquids and gases: the mismanagement or inadequate containment of flammable liquids or gases such as gasoline, propane, or solvents can precipitate fire-related incidents.
  • Natural catastrophes: fires may be precipitated via natural phenomena such as lightning strikes, volcanic eruptions, or wildfires.
  • Human negligence: unintentional oversights, such as leaving appliances operational, failing to extinguish cigarettes properly, or haphazardly discarding flammable materials, contribute to the ignition of fires.
Moreover, fire characteristics are essential to fire detection and fire-type-recognition processes. Fire exhibits various features contingent upon its classification, yet firefighters routinely assess several overarching attributes:
  • Flames: characteristics including colour, shape, and intensity are pivotal indicators. For instance, a blue flame may denote a gas-fueled fire, whereas yellow or orange flames typically signify combustible material combustion.
  • Smoke: Attributes encompassing colour, density, and odour contribute to material identification. Dense, black smoke commonly signifies synthetic-material combustion, while lighter, greyish smoke suggests natural-material involvement.
  • Heat: fire intensity and temperature give firefighters insights into potential propagation rates and associated hazards.
  • Behavior: Fire dynamics, such as the spread rate, auditory cues, or the presence of crackling noises, furnish supplementary information to firefighters. It is imperative to note that these characteristics are general, with specific fire types potentially presenting unique attributes. Firefighters undergo training to discern these traits effectively, facilitating appropriate action.
Furthermore, fire types/classes should be recognized. The categorization of fires is fundamental in fire science and emergency response protocols. Fires are typically classified based on the materials fueling their combustion, with each type exhibiting distinct characteristics and posing unique challenges to containment and extinguishment [8]. According to [9], four primary classifications commonly referenced in fire science literature are as follows:
  • Class A fires: Class A fires involve combustible and solid materials such as wood, paper, cloth, and plastics. They are characterized by glowing embers and typically produce ash upon combustion. Class A fires are extinguished using water or other agents that cool the burning material and remove the heat source.
  • Class B fires: Class B fires involve chemical materials such as magnesium, titanium, potassium, and sodium. These fires can burn extremely high temperatures and produce intense, blinding light. Extinguishing Class B fires often requires specialized dry powders or sand-based agents that react with the metal to form a crust, cutting off the oxygen supply.
  • Class C fires: Class C fires involve energized electrical equipment or wiring. These fires pose unique challenges due to the risk of electric shock and the potential for re-ignition if the power source is not adequately de-energized. Extinguishing Class C fires requires non-conductive, specialized agents that do not create a conductive path to the electrical source, such as dry chemical powders or carbon dioxide.
  • Class D fires: Class D fires involve flammable liquids or gases, including gasoline, oil, grease, and solvents. They often produce visible flames and may spread rapidly, making them particularly hazardous. Extinguishing Class D fires typically involves smothering the flames with foam, dry chemical agents, or carbon dioxide to deprive them of oxygen.
Understanding the characteristics and appropriate extinguishing methods for each fire class is essential for firefighters and emergency responders to effectively mitigate the risks of various fire incidents. Additionally, ongoing research and advancements in fire science contribute to improved strategies and technologies for fire prevention, suppression, and safety. The evaluation of existing firefighting systems yielded several noteworthy findings, with efficacy varying by the type and scale of the fire. Water-based suppression systems effectively extinguished Class A fires, particularly those involving combustible solids like wood or paper. Conversely, foam or carbon dioxide suppression systems proved more productive for Class B fires involving flammable liquids or gases, and the performance of dry chemical extinguishers was particularly notable in combating Class C fires involving energized electrical equipment. The response time also plays a critical role in operational effectiveness: systems with automatic detection mechanisms exhibited significantly reduced response times compared to manually activated systems.
The ability to quickly and efficiently detect the presence of fire and determine the class or type of fire can significantly reduce the losses and damage resulting from fires. This work presents a set of empirical investigations that detect the presence of fires and automatically classify the type of fire. It introduces a multiclass dataset based on the types of origins of fires. This information can help firefighters identify how to deal with a fire more efficiently, reducing risks and minimizing damage. Indeed, our methodology includes two different classification settings: a single-level classification structure and a two-level classification structure (pipeline). Firstly, this work experimented with a flat, single-level classification structure in which all images were processed and classified into suitable classes: no fire and solid-material, chemical, electrical, or oil-based fires (five classes in total). Secondly, it experimented with two levels of nested classification, starting with a binary classification for fire detection (fire vs. no fire). The images classified as fire entered another classification level to determine the type of fire, with four classes: solid-material, chemical, electrical, or oil-based fires. In addition, this work contributes a manually collected and annotated benchmark dataset of five classes, released publicly to serve as a testbed for further research investigations.
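To make the two-level pipeline concrete, the following is a minimal sketch in Python, assuming two trained Keras-style models: a binary `fire_detector` with a sigmoid output and a four-way `fire_type_classifier` with a softmax output. The 0.5 threshold and the class ordering are illustrative assumptions, not values specified in this work.

```python
# A minimal sketch of the two-level (pipeline) classification setting.
# `fire_detector` and `fire_type_classifier` are hypothetical trained models.
import numpy as np

FIRE_TYPES = ["solid-material", "chemical", "electrical", "oil-based"]

def classify_image(image, fire_detector, fire_type_classifier):
    """Level 1: fire vs. no fire; Level 2: four-way fire-type classification."""
    x = np.expand_dims(image, axis=0)            # add a batch dimension
    p_fire = float(fire_detector.predict(x)[0][0])
    if p_fire < 0.5:                             # binary threshold (assumed 0.5)
        return "no fire"                         # stop early: no second level
    type_probs = fire_type_classifier.predict(x)[0]
    return FIRE_TYPES[int(np.argmax(type_probs))]
```

Note that the second model is invoked only when a fire is detected, which is the source of the time and power savings discussed in Section 5.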

2. Related Work

The literature shows that many researchers have been interested in investigating the issue of automatic fire detection using image processing. Due to the importance of early fire detection, machine-learning-based approaches have been heavily utilized to enhance the possibility of early automatic fire detection. Among the early attempts, the authors of [10] focused on the issue of reaching a fire location quickly through an automatic fire-recognition system. The proposed system utilized image processing in a MATLAB environment and the RGB and YCbCr colour spaces to identify fires in a small dataset of 100 images. The results showed that the colour space influenced recognition accuracy: the authors reported accuracy scores of up to 90% with RGB and 100% with YCbCr, since YCbCr reduced the influence of light on colours [10,11].
Another work, by [11], discussed the evaluation of rules for fire detection in outdoor vegetation-fire images. The authors used 31 rules specifically for wildland-fire pixel-colour detection. The rules were evaluated on a dataset of 500 fire images, categorized by colour and smoke presence for fire pixels and by luminosity for non-fire pixels. The authors used logistic regression to learn from the defined rules to predict the likelihood of a pixel being a fire pixel; the features used in the logistic regression were derived from the rules defined for fire detection. They employed a dataset of 500 RGB images of outdoor vegetation fires of varying sizes (from 183 × 242 to 4000 × 3000 pixels) and formats (JPG, PPM, BMP). These images covered various lighting conditions (sunny, cloudy, night, and day). Fire areas in the images were manually segmented to create ground-truth data. Fire pixels were labelled by colour type (red, orange, white–yellow, or other) and classified as smoke or smokeless using a support vector machine. They reported the best performance, an F score of 0.91, using logistic regression.
In 2016, [12] focused their work on the city of Cape Town, as it was declared one of the most fire-prone cities in South Africa. They proposed a system utilizing an ANN for wildfire risk assessment. The model used factors like climate and location as features to predict the risk rate of wildfire ignition for two different vegetation types. The system was trained on historical fire data from 2009 to 2015 and produced categorical outputs of low, moderate, high, and extreme. The authors reported an accuracy of up to 0.97 using their historical dataset.
The work of [13] addressed the challenges in fire detection due to environmental factors and scene variations. It proposed a new method incorporating colour-space information into the Scale Invariant Feature Transform (SIFT) algorithm for better feature extraction. The Incremental Vector Support Vector Machine (IV-SVM) classifier was then used to build a fire-recognition model. Experiments on real-life fire images showed that this method outperformed existing ones in accuracy and speed, making it highly promising for practical applications.
Later work by [14] focused on predicting and mapping fire susceptibility in Pu Mat National Park, Vietnam, using four machine learning methods: a Bayes network (BN), naïve Bayes (NB), a decision tree (DT), and multivariate logistic regression (MLR). The study used data from 57 historical fires and nine explanatory variables: elevation, slope degree, aspect, average annual temperature, drought index, river density, land cover, and distances from roads and from residential areas. For feature selection, the relief-F method was used to determine the importance of these variables. During validation, the BN model had the highest AUC value (0.96), followed by the DT (0.94), NB (0.939), and MLR (0.937) models.
In [15], the authors presented a review of machine learning (ML) applications in wildfire science and management. The study spans the history of ML use in this field since the 1990s. The authors highlighted the machine-learning and deep-learning models most commonly used for wildfire issues, such as random forests, decision trees, neural networks, support vector machines, K-nearest neighbours, Bayes networks, naïve Bayes, and multivariate logistic regression. In recent years, they noticed that the artificial neural network (ANN) has been the most commonly used fire-detection algorithm. This aligns with the work of [16], wherein the authors utilized a machine-learning model for fire prediction in the coal storage industry. The researchers used artificial neural networks and convolutional neural networks for fire prediction.
The work of [17] proposed a system to detect early fire symptoms in smart homes using various sensors and fuzzy logic. They used sensors and real-time data to detect fire symptoms in smart houses.
In [18], the authors investigated the use of different fire sensors to enhance early fire detection in underground diesel-fuel storage areas, focusing on improving safety in underground mines. Diesel-fuel fire tests were conducted in a simulated underground storage area at the NIOSH Safety Research Coal Mine. The study concluded that smoke and flame sensors provide more rapid fire detection than CO sensors, with flame sensors showing the best overall performance. The findings suggested optimal sensor placements to enhance early fire detection and ensure the safety of underground miners.
The work of [19] explored integrating machine learning with mechanistic models to improve fire engineering and sciences (FESs). The paper advocated for using ML to address complex problems in FESs by showcasing its potential through various examples and recommended procedures. The authors used a dataset compiled from historical fire tests, sensor data, and simulation outputs. The paper concluded that leveraging machine learning and artificial intelligence techniques, particularly alongside mechanistic models, can significantly enhance the predictive capabilities and efficiency of fire engineering and the fire sciences.
Similarly, [11] compared several image-processing-based fire-detection methods using rule-based and machine-learning (ML) approaches. The objective was to improve the accuracy and reliability of fire detection in wildland areas under various conditions. They combined the rules with a dataset of 500 outdoor images and logistic regression. The study concluded that the proposed logistic regression method, which incorporates rules using machine learning, provides the best performance for fire pixel detection. It demonstrated the potential for developing more robust fire detection techniques suitable for unstructured environments, addressing the limitations of individual rule-based methods.
In [20], the authors focused on energy applications wherein the type of fuel used necessitates specific chemical processes for optimal results. Misprocessing fuels can lead to waste and inaccurate conclusions, particularly when the material has already been processed, complicating classification. The authors employed a machine-learning approach to classifying fuels using proximate analysis results, which included fixed carbon, volatile matter, and ash content. Data were collected from the literature and classified into four categories: coals, woods, agricultural residue, and manufactured biomass. Three machine learning classifiers—K-nearest neighbour, support vector machines, and random forest—were used to develop prediction models. A hierarchical classification approach was implemented. The classifiers’ performances were evaluated using K-fold cross-validation, achieving 96% accuracy in the training phase and 92% in the testing phase. The study confirmed that machine learning, combined with proximate analysis, is a viable approach to fuel classification, offering a more systematic and reliable alternative to traditional methods.
In more recent work by [21], the authors discussed the development of a computational framework to model and simulate aerial drops of fire retardants in dangerous fire environments. The primary objective was optimizing firefighting operations and enhancing pilot safety using advanced technologies. Machine learning optimized firefighting strategies, focusing on parameters like plane velocity, angular velocity, initial position, particle size, sprayer amplitude, release times, and drop rates. The study argued for the feasibility of using a computational framework combining meshless discrete element modelling and machine learning in optimizing aerial firefighting operations.
The work of [22] aimed to develop a real-time fire detection system that identifies the presence of fire and classifies the type of fire based on the surrounding environment. The authors used the YOLOv5 model for both fire detection and classification. Two object detectors were trained separately: one for detecting fire and the other for identifying flammable objects in the environment. Based on the US standard of fire classes, the system classified fires as A, C, and K. Intersection-over-union (IoU) scores were calculated to associate each detected fire with the identified flammable objects, determining the fire class. The authors created a comprehensive dataset for each label and further trained the YOLOv5 model to identify various objects like curtains, couches, and laptops, which are potential sources of fire. The system was tested on a video dataset of 150 samples.
In summary, the literature reveals significant interest among researchers from various fields in improving the accuracy of early fire detection systems. Extensive efforts have spanned a wide range of methods for recognizing fire symptoms, utilizing both historical fire data and real-time sensor-based data. The existing literature indicates that machine learning and rule-based approaches have achieved notable effectiveness, as evidenced by the high levels of accuracy reported (see Table 1). Our research integrates and extends previous work by introducing an additional dimension, emphasizing fire recognition and the automatic classification of fire types. This advancement has the potential to significantly aid firefighters by enhancing the efficiency and effectiveness of their operations, enabling them to determine the most appropriate firefighting methods for different fire scenarios.

3. Experimental Framework

Several preliminary experiments were conducted to determine the best possible experimental configurations, and a well-established structure was followed for the experiments. This section outlines the experimental setup used in this work.

3.1. Data Collection

Our investigation involved examining the currently publicly available datasets. A considerable number of existing datasets cover this topic. However, they all focus on binary classification of instances into fire vs. no fire. It became apparent that there is a lack of publicly accessible datasets that incorporate classification according to fire classes. Consequently, this work embarked on creating a dataset that will be made publicly available to foster additional inquiries by scholars within the field and allow the reproduction of the results. To create a new dataset covering instances spanning the major fire classes, we opted to utilize some currently available binary (fire vs. no fire) datasets that mostly focus on natural wildfire [23,24], which essentially meant training models to automatically detect merely the existence of fire. In addition, new images were added and manually annotated into the main classes of fire. Table 2 shows the characteristics of our Fire Classification Dataset (FCD), released as a part of this work to allow further research explorations [25].
To annotate instances in our dataset, the guideline for different fire characteristics provided by the Fire Safe organization in the United Kingdom was used [9]. The solid-material fire is the default class of fire images. In contrast, an electrical fire can be recognized at the beginning of a fire, since it usually starts with a bright spark. Moreover, the flame of a chemical fire can be of different colours, such as green, blue, red, or violet. An oil fire can be recognized through its smoke colour and fire location [9]. To ensure the quality of the images used and the correctness of the annotation, the collected images came from different resources, such as video frames of actual, real-life events. For instance, Figure 1 shows a sample image of a real-life oil-refinery fire incident in Iraq that was reported on the BBC's news outlet [26]. The images in the dataset were carefully selected to cover a wide range of possible scenarios, e.g., daylight and night effects. The images also capture various stages of fire, which can help identify as many features as possible for each fire type. Figure 2 shows an example image of a chemical fire with the colour of the flame as the most distinguishing feature [27].
Our preliminary experiments tested different settings for the data split. Specifically, we experimented with an 80% training and 20% testing split and with a 67% training and 33% testing split, and concluded that there was no significant difference. As such, all of our results in this work are reported using the 80% training and 20% testing data split.
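The split can be reproduced along these lines with scikit-learn; in the minimal sketch below, the placeholder arrays stand in for the loaded FCD data, and the stratification and random seed are our assumptions, not settings stated in this work.

```python
# A minimal sketch of the 80%/20% data split used for all reported results.
import numpy as np
from sklearn.model_selection import train_test_split

# Placeholder arrays standing in for the loaded FCD images and labels.
images = np.random.rand(1353, 256, 256, 3).astype("float32")
labels = np.random.randint(0, 5, size=1353)   # five classes (incl. no fire)

X_train, X_test, y_train, y_test = train_test_split(
    images, labels,
    test_size=0.20,       # 80% training / 20% testing
    random_state=42,      # assumed seed for reproducibility
    stratify=labels,      # assumed: preserve class proportions in both splits
)
```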

3.1.1. Data Pre-Processing

Our preliminary experiments show that the trained models performed best with the following settings:
Image size: We experimented with different image sizes, and the best results were attained at 256 × 256. As such, all images were resized to 256 × 256 and saved in the Joint Photographic Experts Group (JPEG) format (see Figure 3 and Figure 4).
Image colour conversion: The images were converted from the blue, green, and red (BGR) format to the red, green, and blue (RGB) format using OpenCV.
Applying blur: A Gaussian blur filter was applied to the images to smooth them.
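A minimal sketch of these three steps follows, using OpenCV as described above for the colour conversion; the Gaussian kernel size is an assumption, since the filter parameters are not stated here.

```python
# A minimal sketch of the pre-processing steps: resize, BGR->RGB, Gaussian blur.
import cv2

def preprocess(path):
    img = cv2.imread(path)                      # OpenCV loads images as BGR
    img = cv2.resize(img, (256, 256))           # standardize the dimensions
    img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)  # convert BGR to RGB
    img = cv2.GaussianBlur(img, (5, 5), 0)      # smooth; (5, 5) kernel assumed
    return img

# cv2.imwrite expects BGR, so convert back before saving as JPEG:
# out = cv2.cvtColor(preprocess("fire.jpg"), cv2.COLOR_RGB2BGR)
# cv2.imwrite("fire_preprocessed.jpeg", out)
```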

3.2. The Algorithm and Learning Models

This section describes the general algorithm, as shown in Figure 5.
In Figure 5, the images come from sources such as cameras, video clips, and datasets. Video clips were converted into frames of images, which were used as inputs to our model. Then, the pre-processing operations were applied to the images, as explained in Section 3.1.1. After that, each image was processed using a learning model to predict the final class; the classes were listed in Table 2. The learning process could follow two approaches: single-level classification and two-level classification. In the learning phase of both approaches, a set of learning models known in the literature was employed to perform the classification tasks. Specifically, the experiments compared the following models' performance:
  • MobileNetV2, a fifty-three-layer-deep convolutional neural network trained on more than a million images from the ImageNet database.
  • InceptionV3, a convolutional neural network for assisting in image analysis and object detection.
  • VGG-16, which is a deep convolutional neural network model used for image classification tasks. The network comprises 16 layers of artificial neurons, each working to process image information incrementally and improve the accuracy of its predictions.
  • EfficientNetV2L, which is a convolutional neural network with a faster training speed and better parameter efficiency than previous models.
  • VGG-19, which is a nineteen-layer convolutional neural network trained on more than a million images from the ImageNet database.
  • ResNet-50, a convolutional neural network (CNN) that excels at image classification. ResNet-50 is a highly trained image analyzer that can dissect a picture, identify objects and scenes, and categorize them accordingly.
  • DenseNet121, a 121-layer densely connected convolutional network in which each layer receives the feature maps of all preceding layers.
  • EfficientNet-b0, which is a convolutional neural network algorithm trained on over a million images from the ImageNet database.
Table 3 shows the settings we applied to all our models. The values are based on our preliminary experiments.
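As an illustration of these settings, the sketch below assembles one of the eight models (DenseNet121) with the Table 3 values in Keras; the classification head and the choice of the Adam optimizer are assumptions, as they are not specified above.

```python
# A hedged sketch: DenseNet121 configured with the Table 3 settings
# (image size 256x256, learning rate 0.001, categorical cross-entropy,
# 300 epochs, batch size 1400). Head design and optimizer are assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

base = tf.keras.applications.DenseNet121(
    include_top=False, weights="imagenet", input_shape=(256, 256, 3)
)
model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(5, activation="softmax"),   # five-way flat setting
])
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=0.001),  # Table 3
    loss="categorical_crossentropy",                          # Table 3
    metrics=["accuracy"],
)
# model.fit(X_train, y_train_onehot, epochs=300, batch_size=1400)  # Table 3
```

For the binary (fire vs. no fire) level, the head would instead be a single sigmoid unit with the binary cross-entropy loss listed in Table 3.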

3.3. Performance Evaluation

In classification problems, the overall performance was measured by identifying the success rate, which is the proportion of correctly classified cases over the entire set of cases. The results are reported using a weighted F score, accuracy, precision, and recall [28]. The weighted F score is the average of the F scores attained for each class, with each class's F score weighted by its number of instances.
Accuracy is one of the most widely reported metrics, especially in classification problems, and it is calculated as follows:
$$\mathrm{Accuracy} = \frac{\text{number of correctly classified instances}}{\text{total number of instances}}$$
The F score is defined as the harmonic average of precision and recall. (A control parameter $\beta$ can be used to decide how much emphasis to put on precision vs. recall; F1, or, by convention, F, is where $\beta$'s value is 1, denoting an equal, balanced emphasis on both metrics.) It is calculated as follows:
$$F_1 = \frac{2 \times \mathrm{Precision} \times \mathrm{Recall}}{\mathrm{Precision} + \mathrm{Recall}}$$
where precision is calculated as follows:
$$\mathrm{Precision} = \frac{A}{A + C}$$
And recall is calculated as follows:
$$\mathrm{Recall} = \frac{A}{A + B}$$
where A is the number of correctly classified instances of a class (true positives), B is the number of instances of that class that were not classified as such (false negatives), and C is the number of instances incorrectly classified as that class (false positives).
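As a concrete illustration, these metrics, with the class-support weighting described above, can be computed with scikit-learn; `y_true` and `y_pred` below are hypothetical label arrays.

```python
# A minimal sketch of the reported metrics using scikit-learn.
from sklearn.metrics import (accuracy_score, precision_score,
                             recall_score, f1_score)

y_true = [0, 1, 2, 2, 4, 3, 0, 1]   # hypothetical gold labels (5 classes)
y_pred = [0, 1, 2, 1, 4, 3, 0, 2]   # hypothetical model predictions

print("Accuracy :", accuracy_score(y_true, y_pred))
# "weighted" averages the per-class scores by class support, matching the
# weighted F score described above.
print("Precision:", precision_score(y_true, y_pred, average="weighted"))
print("Recall   :", recall_score(y_true, y_pred, average="weighted"))
print("F1       :", f1_score(y_true, y_pred, average="weighted"))
```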

4. Results

This section presents the results of our experiments comparing the performance of various deep-learning models on the FCD dataset collected for the fire-image classification task. The models evaluated include MobileNetV2, InceptionV3, VGG-16, EfficientNetV2L, VGG-19, ResNet-50, DenseNet121, and EfficientNet-b0. The primary metrics used for the comparison were accuracy, precision, recall, F1 scores, and inference times.

4.1. Single-Level Classification

Our experiments were run with two different classification settings. The first was a flat, single-level, five-class classification structure. Table 4 summarizes the accuracy and F scores achieved with this flat architecture. The MobileNetV2 model attained the best performance with an accuracy score of 94.48%, while EfficientNet-b0 achieved the next best performance with an accuracy score of 93.87%. The lowest performance was recorded with the ResNet-50 model, at an accuracy score of 61.35%. Figure 6 shows the accuracy scores attained by MobileNetV2 in the five-way classification setting over 300 epochs.
Figure 7 presents the confusion matrix for the MobileNetV2 model applied to the five-way classification task. An analysis of the per-class precision and recall scores indicated that this model outperformed ResNet50 across all classes. This suggests that certain models may exhibit reduced sensitivity to the number of data instances per class. Instead, the quality of the dataset utilized can significantly influence the accuracy of the results. Given that our FCD dataset comprises 1353 manually collected and annotated instances, it is contended that its high quality affords substantial potential for advancing research in this domain.

4.2. Two-Level Classification

The second setting used two levels of nested classification: binary (fire vs. no fire), followed by a four-way classification of the four fire types: solid-material, chemical-based, electric-based, and oil-based fires. Table 5 shows the results of the first level, in which the models performed a binary classification to determine whether there was a fire. For this task, InceptionV3 and DenseNet121 attained our best results with an accuracy score of 98.16%, a relatively high score given the simplicity of the task compared to the second classification level. The literature (Table 1) shows similar accuracy scores, although past results are not directly comparable to our findings due to a lack of access to the data used in previous work.
For the second classification level, the models were trained to perform a four-way classification task to distinguish between the four types of fire: solid-material, electrical, oil-based, or chemical. Table 6 shows the results for this set of experiments, wherein the models, once the presence of fire was determined, classified the type of fire. The best performance was achieved using the EfficientNet-b0 model, with an accuracy score of 97.55%, followed by the DenseNet121 model at 95.09%. The lowest performance in this task was attained using EfficientNetV2L, with an accuracy score of 54.60%.

5. Discussion

Based on the experiments outlined in the previous section, this work recommends the two-level classification architecture, for two reasons. First, it offers potential time and power savings, since processing stops once a no-fire class is detected. Second, the two-level approach resulted in better accuracy, which can be explained by the task being split into two parts: the models first identified whether there was a fire and, in the case of a fire being present, proceeded to determine its type. The more accurately the type of fire is identified, the more efficiently the concerned firefighting department can respond, as it can readily and quickly identify the proper method and tools to put out the fire. It is also worth noting that the average of the best results for the binary and four-way classifications in this setting was 97.85% accuracy, still better than the best score achieved with the flat, five-way classification, at 94.48% accuracy.
Table 7 shows the five-way classification experiments' per-class precision and recall scores using the ResNet50 model. It can be seen that the no-fire class had the highest precision and recall compared to the rest of the classes. A possible explanation is that this was one of the biggest classes in our dataset, with more than 500 instances. Another explanation is that the no-fire class is highly distinguishable, as the models can easily learn to detect the absence of fire features, such as flames or smoke. Similarly, the number of instances, i.e., the class size, clearly impacted the precision and recall of the oil-based-fire class, the smallest class in our dataset. This probably made it difficult for the models to recognize images in which the prominent features of oil-based fires, like dark smoke, were present. As such, future work will expand the dataset, which is anticipated to expose the models to more varied features of each fire class and help further increase the per-class precision and recall scores. It is also interesting that, for the chemical-fire class, the models seemed to struggle to identify instances precisely, reflected by the lowest precision in the table at 0.37, while attaining a reasonable recall of 0.73, probably because merely flagging the presence of a chemical-fire instance is a relatively more straightforward task for the models. Overall, augmenting the dataset with more high-quality instances is anticipated to positively impact the performance of the trained models.
As for the issue of unbalanced classes (see Table 2), it is interesting to observe the positive impact on the accuracy of detecting the no-fire class, the class with the largest number of instances in the dataset. The relatively large size of this class is also clearly reflected in its precision and recall. However, while influential, class size is not the only relevant factor. Specifically, the numbers of instances in the chemical-fire, electric-fire, and oil-based-fire classes are relatively close, and yet the results of the four-way classification in Table 6 reveal no significant impact on the overall performance, with the best accuracy attained at 97.55%. This is probably due to the quality of the instances in the dataset, as it was manually annotated. The issue of data quality versus quantity has been thoroughly investigated in the literature [29]. To further investigate the impact of balanced vs. unbalanced classes, and following previous work, several preliminary experiments were conducted using over-sampling and under-sampling of the training data [30], as sketched below. The results revealed a slight improvement in the single-level, five-way classification and a marginal to no impact on the two-level classification. As such, we opted to report all findings in this work using the original class distribution. In addition, we reported the results using both accuracy and F scores; the F score is recommended for reporting the findings of classification problems [30], especially with unbalanced classes.
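The resampling experiments mentioned above can be reproduced along these lines with the imbalanced-learn library; the use of simple random over-/under-sampling and the placeholder data are assumptions for illustration, as the exact resampling strategy is not specified here.

```python
# A hedged sketch of over-/under-sampling the training data before retraining.
import numpy as np
from imblearn.over_sampling import RandomOverSampler
from imblearn.under_sampling import RandomUnderSampler

# Placeholder flattened training images and labels (~80% of 1353 instances).
X = np.random.rand(1082, 256 * 256 * 3)
y = np.random.randint(0, 5, size=1082)

# Over-sampling duplicates minority-class instances up to the majority count;
# under-sampling discards majority-class instances down to the minority count.
X_over, y_over = RandomOverSampler(random_state=42).fit_resample(X, y)
X_under, y_under = RandomUnderSampler(random_state=42).fit_resample(X, y)
# Models are then retrained on (X_over, y_over) or (X_under, y_under) and
# compared against training on the original class distribution.
```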
Table 8 shows the performance on randomly selected images from each class. The confidence value in the table denotes the probability or likelihood the model assigns to a specific class for a given image. Typically, this value is a number between 0 and 1, where high confidence (close to 1) means the model is highly confident that an image corresponds to a particular class, and low confidence (close to 0) means the model exhibits little certainty that the image belongs to that class. The table indicates that the models were mostly able to detect the correct class, with confidence varying from 1, as in MobileNetV2, InceptionV3, and EfficientNet-b0, down to lower degrees of confidence, as in EfficientNetV2L. It is also worth mentioning that there were some cases wherein the models could not detect the image class at all, as with EfficientNetV2L and ResNet-50, which aligns with the relatively poor performance attained by these models (see Table 6).
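For illustration, a confidence value of this kind is typically read off the model's softmax output; in the sketch below, `model` and `image` are hypothetical placeholders for a trained classifier and a pre-processed input.

```python
# A minimal sketch of obtaining a per-image confidence value as in Table 8:
# the maximum probability of the model's softmax output vector.
import numpy as np

probs = model.predict(np.expand_dims(image, axis=0))[0]  # softmax vector
predicted_class = int(np.argmax(probs))   # index of the most likely class
confidence = float(np.max(probs))         # close to 1 = highly confident
```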

6. Conclusions

An early and effective reaction to fires as they occur, based on their identified class, can have a significant environmental impact. The ability to quickly and efficiently detect the presence of a fire and determine its class or type can significantly reduce the losses and damage due to fires. As such, this research area has attracted researchers from different disciplines to unite their efforts towards this goal. This work has attempted to advance the existing literature by expanding the scope to include not merely the detection of a fire's existence but also the automatic classification of the fire type, which is vital in determining the method and tools needed to put out a fire. It has introduced a multiclass dataset based on the types of origins of fires.
This study has proposed a system employing strategically positioned fire-detection sensors integrated with pre-trained deep-learning models, such as DenseNet121, to facilitate the rapid identification and localization of fire incidents, thereby enabling the prompt deployment of suppression resources. This work contributes a dataset comprising over 1300 manually annotated images and a series of experiments utilizing eight AI-based deep-learning algorithms to enhance the automatic detection of fires and classify the fire types. Our system achieved its best performance with an accuracy score of 98.16% in automatically detecting fires and an accuracy score of 97.55% in classifying fire types, using the DenseNet121 and EfficientNet-b0 models, respectively. Future advancements can be achieved by expanding the existing dataset, which can serve as a testbed for developing and evaluating new systems.

Author Contributions

The contributions to this work were as follows: conceptualization, E.A.R., A.S. and B.A.; methodology, E.A.R., A.S. and B.A.; software, A.S.; validation, A.S.; formal analysis, E.A.R., A.S. and B.A.; investigation, E.A.R., A.S. and B.A.; resources, E.A.R., A.S. and B.A.; data curation, E.A.R., A.S. and B.A.; writing—original draft preparation, E.A.R.; writing—review and editing, E.A.R. and B.A.; visualization, A.S.; supervision, B.A.; project administration, B.A.; and funding acquisition, E.A.R. All authors have read and agreed to the published version of the manuscript.

Funding

The APC of this research was funded by the Deanship of Graduate Studies and Scientific Research, Jazan University, Saudi Arabia, through project number GSSRD-24.

Data Availability Statement

This study’s research data are accessible to the research community to conduct further research [25].

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Prakash, P. Predicting Cyclone Michaung’s Wrath: Leveraging Deep Learning Technique for Intensity and Track Forecasting. Int. J. Multidiscip. Res. 2024. [Google Scholar] [CrossRef]
  2. Refaee, E.A. Using Machine Learning for Performance Classification and Early Fault Detection in Solar Systems. Math. Probl. Eng. 2022, 2022, 6447434. [Google Scholar] [CrossRef]
  3. Abdelhag, M.E.; Ali, S.E.E.; Amin, S.T.; Ali, A.; Khan, A.A.M. Machine Learning Based Model For Solar Radiation Prediction. Mach. Learn. 2022, 54, 2096–3246. [Google Scholar]
  4. Alhazmi, A.; Alhazmi, Y.; Makrami, A.; Masmali, A.; Salawi, N.; Masmali, K.; Patil, S. Application of artificial intelligence and machine learning for prediction of oral cancer risk. J. Oral Pathol. Med. 2021, 50, 444–450. [Google Scholar] [CrossRef] [PubMed]
  5. Alsharif, A.; Banoqitah, E.; Alnowimi, M.; Morfeq, A. Assessment of Radiation Exposure, Safety Practice, and Awareness Among Healthcare Providers Utilizing CT Units in Jazan Area Hospitals. J. King Abdulaziz Univ. Eng. Sci. 2024, 34, 89. [Google Scholar]
  6. National Fire Protection Association. US Fire Administration Stats. 2024. Available online: https://www.usfa.fema.gov/statistics/ (accessed on 19 July 2024).
  7. Stephens, S.L. Forest fire causes and extent on United States Forest Service lands. Int. J. Wildland Fire 2005, 14, 213–222. [Google Scholar] [CrossRef]
  8. Rie, D.H.; Lee, J.W.; Kim, S. Class B fire-extinguishing performance evaluation of a compressed air foam system at different air-to-aqueous foam solution mixing ratios. Appl. Sci. 2016, 6, 191. [Google Scholar] [CrossRef]
  9. FireSafe. Fire Extinguishers—Classes, Colour Coding, Rating, Location and Maintenance. 2011. Available online: https://www.firesafe.org.uk/portable-fire-extinguisher-general/ (accessed on 19 June 2023).
  10. binti Zaidi, N.I.; binti Lokman, N.A.A.; bin Daud, M.R.; Achmad, H.; Chia, K.A. Fire recognition using RGB and YCbCr color space. ARPN J. Eng. Appl. Sci. 2015, 10, 9786–9790. [Google Scholar]
  11. Toulouse, T.; Rossi, L.; Celik, T.; Akhloufi, M. Automatic fire pixel detection using image processing: A comparative analysis of rule-based and machine learning-based methods. Signal Image Video Process. 2016, 10, 647–654. [Google Scholar] [CrossRef]
  12. Lall, S.; Mathibela, B. The application of artificial neural networks for wildfire risk prediction. In Proceedings of the 2016 International Conference on Robotics and Automation for Humanitarian Applications (RAHA), Amritapuri, India, 18–20 December 2016; IEEE: New York, NY, USA, 2016; pp. 1–6. [Google Scholar]
  13. Chen, Y.; Xu, W.; Zuo, J.; Yang, K. The fire recognition algorithm using dynamic feature fusion and IV-SVM classifier. Clust. Comput. 2019, 22, 7665–7675. [Google Scholar] [CrossRef]
  14. Pham, B.T.; Jaafari, A.; Avand, M.; Al-Ansari, N.; Dinh Du, T.; Yen, H.P.H.; Phong, T.V.; Nguyen, D.H.; Le, H.V.; Mafi-Gholami, D.; et al. Performance evaluation of machine learning methods for forest fire modeling and prediction. Symmetry 2020, 12, 1022. [Google Scholar] [CrossRef]
  15. Jain, P.; Coogan, S.C.; Subramanian, S.G.; Crowley, M.; Taylor, S.; Flannigan, M.D. A review of machine learning applications in wildfire science and management. Environ. Rev. 2020, 28, 478–505. [Google Scholar] [CrossRef]
  16. Ismail, F.B.; Al-Bazi, A.; Al-Hadeethi, R.H.; Victor, M. A Machine Learning Approach for Fire-Fighting Detection in the Power Industry. Jordan J. Mech. Ind. Eng. 2021, 15, 475–482. [Google Scholar]
  17. Giandi, O.; Sarno, R. Prototype of fire symptom detection system. In Proceedings of the 2018 International Conference on Information and Communications Technology (ICOIACT), Yogyakarta, Indonesia, 6–7 March 2018; IEEE: New York, NY, USA, 2018; pp. 489–494. [Google Scholar]
  18. Yuan, L.; Thomas, R.A.; Rowland, J.H.; Zhou, L. Early fire detection for underground diesel fuel storage areas. Process Saf. Environ. Prot. 2018, 119, 69–74. [Google Scholar] [CrossRef] [PubMed]
  19. Naser, M. Mechanistically informed machine learning and artificial intelligence in fire engineering and sciences. Fire Technol. 2021, 57, 2741–2784. [Google Scholar] [CrossRef]
  20. Elmaz, F.; Büyükçakır, B.; Yücel, Ö.; Mutlu, A.Y. Classification of solid fuels with machine learning. Fuel 2020, 266, 117066. [Google Scholar] [CrossRef]
  21. Zohdi, T. A digital twin framework for machine learning optimization of aerial fire fighting and pilot safety. Comput. Methods Appl. Mech. Eng. 2021, 373, 113446. [Google Scholar] [CrossRef]
  22. Jashnani, K.; Kaul, R.; Haldi, A.; Nimkar, A.V. Computer Vision Based Mechanism for Detecting Fire and Its Classes. In Proceedings of the Computer Vision and Image Processing, Nagpur, India, 4–6 November 2023; Gupta, D., Bhurchandi, K., Murala, S., Raman, B., Kumar, S., Eds.; Springer: Cham, Switzerland, 2023; pp. 538–553. [Google Scholar]
  23. Kumar, A. Fire Detection Dataset. 2019. Available online: https://www.kaggle.com/datasets/atulyakumar98/test-dataset (accessed on 19 July 2023).
  24. El-Madafri, I.; Peña, M.; Olmedo-Torre, N. Wildfire Dataset. 2024. Available online: https://www.kaggle.com/datasets/elmadafri/the-wildfire-dataset (accessed on 13 January 2024).
  25. Refaee, E.; Sheneamer, A.; Assiri, B. Fire Type Classes Dataset. 2024. Available online: https://zenodo.org/records/13119922?preview=1&token=eyJhbGciOiJIUzUxMiJ9.eyJpZCI6ImI0NjA1OWRlLWM1ZjctNDM0NS04Mjk0LTA1OGViNzA1Zjg2MCIsImRhdGEiOnt9LCJyYW5kb20iOiIyZDE3NjBlMTQ4YzNmZDQ0MmQzMTZiODgzZThmMzk5OCJ9.q7irvvmF2Ecs6u1hSOlz8Ny2QgHp1FksziQCFCrgcawqkwrDi47vVkRmksXMuLwNpkz0pi4AtzCA489aAR76Gg (accessed on 19 August 2024).
  26. BBC. Huge Fire Erupts at Oil Refinery in IRAQ. 2024. Available online: https://www.bbc.com/news/videos/cerrk983xego (accessed on 13 June 2024).
  27. Bakersfield, C. KCFD Crews Extinguish Blue Flames from Chemical Fire in Shafter. 2023. Available online: https://bakersfieldnow.com/news/local/kcfd-crews-extinguish-blue-flames-from-chemical-fire-in-shafter-vegetation-sulfur-kern-county-department-agricultural (accessed on 13 February 2024).
  28. Witten, I.H.; Frank, E.; Hall, M.A. Data Mining: Practical Machine Learning Tools and Techniques, 2nd ed.; Morgan Kaufmann: Burlington, MA, USA, 2013; pp. 213–222. [Google Scholar]
  29. Sadiq, S.; Indulska, M. Open data: Quality over quantity. Int. J. Inf. Manag. 2017, 37, 150–154. [Google Scholar] [CrossRef]
  30. Mohammed, R.; Rawashdeh, J.; Abdullah, M. Machine learning with oversampling and undersampling techniques: Overview study and experimental results. In Proceedings of the 2020 11th International Conference on Information and Communication Systems (ICICS), Irbid, Jordan, 7–9 April 2020; IEEE: New York, NY, USA, 2020; pp. 243–248. [Google Scholar]
Figure 1. A sample image of an oil fire taken from the BBC news website, showing a real-life oil-refinery fire incident.
Figure 2. A sample image of a chemical fire with the blue flame as a distinguishing feature.
Figure 3. Sample images illustrating the difference between images in the no-fire class before and after pre-processing.
Figure 4. Sample images illustrating the difference between images in the oil-based-fire class before and after pre-processing.
Figure 5. The general fire algorithm.
Figure 6. The accuracy score of MobileNetV2 in the five-way classification with up to 300 epochs.
Figure 7. The confusion matrix of MobileNetV2 on the flat, five-way classification.
Table 1. Summary of previous work on image-based fire detection and classification.

| Work | Dataset | Approach | Type of Classification | Results |
|------|---------|----------|------------------------|---------|
| [10] | 100 images | MATLAB and RGB & YCbCr | Binary (fire vs. no fire) | 90–100% acc. |
| [11] | 500 wildfire images | Rule-based, LR, and SVM | Binary (fire vs. no fire) | 0.91 F |
| [12] | Historical data for Cape Town | ANN | Low, moderate, high, and extreme | 0.97 acc. |
| [18] | Real-time sensor-based | Three types of sensors | Fire existence | Not mentioned |
| [14] | 57 recorded fires | NB, DT, and MLR | Binary (fire vs. no fire) | AUC value: 0.96 |
| [17] | Real-time sensor-based | Fuzzy logic | Detecting fire symptoms | Not mentioned |
| [19] | Historical fires, sensors, and simulation | SVM, DT, ANFIS, and ANN | Binary (fire vs. no fire) | 98% accuracy |
Table 2. Number of data instances in our FCD dataset.

| Fire Class | No. of Instances |
|------------|------------------|
| No fire | 541 |
| Wood and solid-material fire | 308 |
| Flammable-gas and chemical-liquids fire | 163 |
| Electric-based fire | 187 |
| Oil-based fire | 154 |
| Total | 1353 |
Table 3. Values of our empirical configuration setup.

| Experimental Config. | Value |
|----------------------|-------|
| Image size | 256 × 256 |
| Batch size | 1400 |
| Learning rate | 0.001 |
| Epochs | 300 |
| Loss function | Binary and categorical cross-entropy |
Table 4. The results of using eight different classification algorithms for the five-way classification of the no-fire vs. electrical-fire vs. oil-based-fire vs. solid-material-fire vs. chemical-fire classes. Bold values denote the best performance in each column.

| Classification Algorithm | Accuracy | Avg. F1 Score |
|--------------------------|----------|---------------|
| MobileNetV2 | **94.48%** | **0.944** |
| InceptionV3 | 90.18% | 0.901 |
| VGG-16 | 85.89% | 0.861 |
| EfficientNetV2L | 61.96% | 0.622 |
| VGG-19 | 80.98% | 0.813 |
| ResNet-50 | 61.35% | 0.622 |
| DenseNet121 | 93.25% | 0.934 |
| EfficientNet-b0 | 93.87% | 0.939 |
Table 5. The results of using eight different classification algorithms for the binary classification of the fire vs. no-fire classes. Bold values denote the best performance in each column.

| Classification Algorithm | Accuracy | Avg. F1 Score |
|--------------------------|----------|---------------|
| MobileNetV2 | 96.32% | 0.963 |
| InceptionV3 | **98.16%** | **0.981** |
| VGG-16 | 93.87% | 0.938 |
| EfficientNetV2L | 71.78% | 0.718 |
| VGG-19 | 87.73% | 0.877 |
| ResNet-50 | 79.14% | 0.790 |
| DenseNet121 | **98.16%** | **0.981** |
| EfficientNet-b0 | 96.32% | 0.963 |
Table 6. The results of using eight different classification algorithms for the four-way classification of solid-material fire vs. electric-based fire vs. oil-based fire vs. chemical fire. Bold values denote the best performance in each column.

| Classification Algorithm | Accuracy | Avg. F1 Score |
|--------------------------|----------|---------------|
| MobileNetV2 | 93.87% | 0.938 |
| InceptionV3 | 91.41% | 0.914 |
| VGG-16 | 90.18% | 0.902 |
| EfficientNetV2L | 54.60% | 0.557 |
| VGG-19 | 88.96% | 0.889 |
| ResNet-50 | 69.33% | 0.695 |
| DenseNet121 | 95.09% | 0.951 |
| EfficientNet-b0 | **97.55%** | **0.975** |
Table 7. The per-class precision and recall scores for the five-way classification of the no-fire vs. electrical-fire vs. oil-based-fire vs. solid-material-fire vs. chemical-fire classes (ResNet50). Bold values denote the best performance in each column.

| Fire Class | Precision | Recall |
|------------|-----------|--------|
| No fire | **0.891** | **0.754** |
| Solid material | 0.681 | 0.405 |
| Chemical fire | 0.378 | 0.736 |
| Electrical-based fire | 0.454 | 0.625 |
| Oil-based fire | 0.437 | 0.389 |
| Avg. | 0.669 | 0.613 |
Table 8. Randomly selected images from our dataset with their predicted labels and confidence values for the different algorithms. The numbers denote the degree of confidence with which a model detected an image's class. ND stands for not detected.

| Image | MobileNetV2 | InceptionV3 | VGG-16 | EfficientNetV2L | VGG-19 | ResNet-50 | DenseNet121 | EfficientNet-b0 |
|-------|-------------|-------------|--------|-----------------|--------|-----------|-------------|-----------------|
| Image 1 | 1.00 | 1.00 | 0.89 | 0.36 | 0.98 | 0.58 | 1.00 | 1.00 |
| Image 2 | 1.00 | 1.00 | 0.94 | ND | 0.66 | ND | 1.00 | 1.00 |
| Image 3 | 1.00 | 1.00 | 0.84 | 0.37 | 0.91 | 0.37 | 1.00 | 1.00 |
| Image 4 | 1.00 | 1.00 | ND | ND | 0.77 | ND | 1.00 | 1.00 |