Article

Computational Techniques for Analysis of Thermal Images of Pigs and Characterization of Heat Stress in the Rearing Environment

by Maria de Fátima Araújo Alves 1, Héliton Pandorfi 1, Rodrigo Gabriel Ferreira Soares 2, Gledson Luiz Pontes de Almeida 1, Taize Calvacante Santana 1 and Marcos Vinícius da Silva 3,*

1 Department of Agricultural Engineering, Federal Rural University of Pernambuco, Recife 52171-900, PE, Brazil
2 Department of Statistics and Computer Science, Federal Rural University of Pernambuco, Recife 52171-900, PE, Brazil
3 Graduate Program in Forest Sciences (Programa de Pós-Graduação em Ciências Florestais), Universidade Federal de Campina Grande, Av. Universitária, s/n, Santa Cecília, Patos 58708-110, PB, Brazil
* Author to whom correspondence should be addressed.
AgriEngineering 2024, 6(3), 3203-3226; https://doi.org/10.3390/agriengineering6030183
Submission received: 29 July 2024 / Revised: 22 August 2024 / Accepted: 2 September 2024 / Published: 6 September 2024

Abstract: Heat stress stands out as one of the main elements linked to concerns related to animal thermal comfort. This research aims to develop a sequential methodology for the automatic extraction of characteristics from thermal images and the classification of heat stress in pigs by means of machine learning. Infrared images were obtained from 18 pigs housed in air-conditioned and non-air-conditioned pens. The image analysis consisted of pre-processing, followed by color segmentation to isolate the region of interest, extraction of the animals' surface temperatures with a purpose-built algorithm, and recognition of the comfort pattern through machine learning. The results indicated that the automated color segmentation method identified the region of interest with an average accuracy of 88% and that the extracted temperatures differed from those of the ThermaCam program by at most 0.82 °C. Using a Support Vector Machine (SVM), the automatic classification of pigs into thermal comfort and thermal discomfort reached a precision of 80% and an accuracy of 91%, indicating that the proposal has the potential to monitor and evaluate the thermal comfort of pigs effectively.

1. Introduction

Heat stress stands out as one of the main elements linked to concerns related to animal welfare [1]. According to Alves et al. [2], high temperatures in the rearing environment induce heat stress in the animals, resulting in adverse effects on their physiological and behavioral responses and on feed intake. Cai et al. [3] argue that the body temperature of animals is susceptible to the adverse effects of meteorological elements, such as air temperature, relative humidity, and solar radiation, which can cause variations in the animal’s body temperature. In pigs, heat stress results from an imbalance in thermal regulation, in which the animal generates more heat than it can dissipate into the environment. This can lead to reductions in feed intake, growth, feed efficiency, and reproductive rate [4].
Physiological responses, such as respiratory rate and internal body temperature, are indicators of health, thermal comfort and, consequently, of the productive efficiency of the animals. However, the collection of these variables is traditionally invasive, manual, and visual, which makes it a laborious process that is stressful for the animals and no longer meets the demands of modern livestock production [5].
The surface temperature of pigs is also an indicator of their thermal comfort state [6]. Monitoring the body temperature of animals allows the identification of individuals that are in heat stress or infected, contributing both to the mitigation of stress through climate control systems and to the prevention of disease spread through the isolation of affected pigs [7].
The variation in the body surface temperature of the animals is manifested directly through peripheral vasodilation in the active tissues as a response to high air temperature. Changes due to infectious processes also impact the temperature of internal tissues, causing variations in blood flow and releasing energy in the form of infrared radiation. These changes are evidenced through an uneven distribution of temperature between different body regions [8]. For this reason, according to Cai et al. [3], it is possible to measure body surface temperature using an infrared-sensitive thermal camera.
In the face of an increase in environmental temperature, pigs adjust their body temperatures by transferring energy from the body to the environment through sensible (conduction, convection, and radiation) and latent (evaporation) means, with the activation of the physiological mechanisms of sweating and tachypnea. In addition, they expand the superficial blood vessels to facilitate heat loss, and the effect of vasodilation helps in the dissipation of energy by radiation, with an increase in the temperature of the skin surface, captured by infrared thermography [5].
Thermal imaging represents a non-invasive and non-destructive method for measuring the temperature of an object. This process uses irradiation in the infrared spectrum emitted by the object to form a visual profile of the temperature in the captured scene, as discussed by Wilson et al. [9]. Inside the infrared thermal imager is a non-contact infrared temperature sensor, capable of detecting the infrared radiation emitted by the surface of the object, as mentioned by Godyn et al. [10].
The object’s surface temperature data are converted into a grayscale image or a color image through the internal signal processing system after the infrared thermal imager picks up the infrared radiation. The higher the intensity of the infrared radiation coming from the object analyzed, the higher the gray level or color intensity in the thermal image, as observed in the study of He et al. [11].
Thermal imaging technology in pig farming has attracted attention in recent years due to its potential. The study by Kadirvel et al. [12] used infrared thermal cameras to assess the body temperature of pigs. This technology has been explored for various purposes, such as detecting coughs in pigs under field conditions [13], obtaining 3D point cloud data for pig segmentation [14,15], and non-invasively assessing the internal body temperature of neonatal piglets [15].
The growing use of these non-invasive technologies has stood out as an alternative for monitoring the body temperature of animals [16]. Infrared thermography (IRT) tends to be a promising non-invasive technology and has the potential to reduce the stress response in animals associated with contact temperature measurement, such as rectal temperature measurement, which, in turn, can cause the transfer of bacteria and other pathogens to the animal. It is being progressively incorporated into both research and practical applications for measuring the temperature in animals [3,17].
IRT not only serves to automate the monitoring of the surface temperature of pigs but can also be used to extract the body shape of the animals by segmenting their contour, making it possible to evaluate the stress conditions in the region of interest via computer vision techniques [7].
In computer vision, image segmentation technology plays an important role in diagnostic systems, as noted by Singh et al. [18]. It should be noted that detection techniques using IRT can be improved by applying pre- and post-processing algorithms to the thermographic data to eliminate unwanted interferences, ensuring accurate and reliable diagnoses [19].
Rodrigues et al. [5] developed a method called Thermal Signature to extract characteristics from the data obtained by IRT and employ them as input attributes in machine learning-based models to evaluate heat stress in calves, achieving an accuracy of 83.29% in classifying the thermal states of comfort, alert, danger, and emergency using the temperature of the eye region obtained from the thermal image. Mcmanus et al. [20] recommend the use of IRT as an aid tool for the recognition of infectious states in production animals.
Support Vector Machine (SVM) algorithms have been applied in conjunction with thermal imaging data for various purposes in pig farming. These algorithms have been used for behavior classification, such as cough detection [14,21], as well as for measuring temperature in groups of pigs using thermal sensors integrated into smartphones [22,23]. Additionally, SVMs have been employed to distinguish between pigs in different states of stress [23].
Due to its non-invasive nature, Whittaker et al. [24] argue that IRT emerges as an effective alternative in pain assessment, being used to detect acute processes in inflammatory states, such as osteoarthritis, and during routine surgical procedures, such as castration, tail docking and sprouting, in which analgesic drugs are not always administered to animals.
In the analysis of thermal images of pigs, one of the fundamental objectives is to predict the thermal comfort of the animals. This involves identifying temperature patterns that indicate whether pigs are in conditions of comfort or heat stress. This prediction is important to ensure the well-being and adequate performance of animals in animal production, as thermal comfort directly influences their health and productivity. In this context, this study was conducted with the objective of developing a sequential methodology for the extraction of automatic characteristics from thermal images and classification of heat stress of pigs through machine learning.

2. Materials and Methods

2.1. Location and Characterization of the Study Area

The monitoring was carried out in the animal research vivarium located at the Serra Talhada Academic Unit (UAST) of the Federal Rural University of Pernambuco (UFRPE) with pigs in the growing and finishing phases.
The climate of the study area, according to Reis et al. [25], is of the BSh type. This type of climate is called semi-arid, hot and dry. The authors of [26] add that the average annual precipitation is 642 mm, the temperatures range from 20.1 to 32.9 °C and the average relative humidity of the air is 63%.

2.2. Data Collection

Data collection took place over 92 days, in the period between August and December 2017. The experiment was conducted in accordance with ethical guidelines and was approved by the Ethics Committee on the Use of Animals of the Federal Rural University of Pernambuco (CEUA/UFRPE). The approval protocol used was No. 23082.021090/2016-81, ensuring that all the necessary measures for the welfare and care of the animals were followed during the research.
Eighteen pigs were used, including males and females, distributed in groups of three across the pens. The animals were selected from a group available in the university’s experimental animal facility, from the commercial lineage of ¾ Duroc and ¼ Pietrain.
Six stalls were used: three with an air conditioning system, equipped with an evaporative adiabatic cooling system, and three without any air conditioning system, allowing the analysis of the animals under natural ventilation conditions. The air conditioners used in the stalls had a flow rate of 3 L/h, with independent motors, a propeller rotating at a speed of 1750 RPM, and a central disc at 3450 RPM.

2.2.1. Recording of Thermal Images

Although each pen contained three animals, data were collected from only two animals to simplify the process and overcome logistical challenges. This was due to the fact that the animals were mostly in motion, which made data collection more complex and required additional personnel to ensure accurate identification. To facilitate identification in the thermal images, special brushes were used to mark each animal distinctly. This marking helped the observer to identify which animal was being recorded, ensuring the correct association between the animals and the captured images. Additionally, to ensure proper separation of the images, the observer used a closed hand gesture to separate the images of the animals and an open hand gesture to distinguish between the climatized and non-climatized pens.
The images were obtained using a FLIR E60 infrared thermal imager. The thermal imager has a measurement range of −20 to 120 °C, a reading accuracy of ±2 °C, and an IR resolution of 320 × 240 pixels; the software used to download the images was FLIR Tools, which is compatible with the equipment.
The images were recorded from a distance of 1 m between the thermal camera and the animal. This distance was sufficient to frame the entire body of the animal. Recordings were made weekly at three fixed times: 08:00, 12:00, and 16:00. These captures were taken regardless of the animals’ positions, as long as they included the full body of the animals. A total of 486 images were acquired; however, only those where the animals could be distinctly identified were selected, as thermal images can be less clear when animals are close to each other. This resulted in a total of 226 thermal images, with 113 from the climatized environment and 113 from the non-climatized environment, each with dimensions of 320 × 240 pixels, which were subsequently exported to “.jpg” format.
The images were adjusted based on local atmospheric conditions, considering the temperature and relative humidity of the environment. In addition, the emissivity was corrected to 0.98, which, according to Tucker et al. [17], is ideal for biological tissues and, according to Wang et al. [7], is used in more than 60% of the research. Corrections were made manually in FLIR Tools software version 5.13.

2.2.2. Recording of Environmental Variables

Air temperature (T, °C), relative humidity (RH, %), black globe temperature (bgt, °C) and wind speed (Ws, m/s) were recorded during the 92 days of monitoring. The records of air temperature, relative humidity and black globe temperature were obtained with a HOBO® U12-012 datalogger (Onset Computer Corporation, Bourne, MA, USA), during the day and night. The positioning of the instrument followed the recommendations of Barbosa Filho et al. [27], who suggest that the equipment be installed preferably in the central region of the shed.
The data collection regarding wind speed (Ws, in m/s) was carried out using a digital thermo-anemometer, model TAFR-180, with a scale of 0.1 to 20.0 m/s and a resolution of 0.1 m/s, at a distance of 1.50 m from the ground.
These variables were collected inside the stalls using the datalogger and in the external environment using a meteorological shelter. Both pieces of equipment were positioned at a height of 1.50 m, in order to obtain information representative of the variations between the treatments.

2.2.3. Tool Used

This study proposed a method of automatic feature extraction and classification of thermal images in pigs. The method was developed in four stages: pre-processing, image segmentation, feature extraction, and pattern recognition and interpretation, based on the proposal of Wang et al. [7]. The proposed algorithm is presented in the flowchart (Figure 1).

Preprocessing

The image pre-processing steps were carried out to remove noise, that is, unwanted interference or distortions in the images caused by factors such as failures in the image acquisition equipment, adverse environmental conditions, or imperfections in the transmission or storage processes during image acquisition. These steps consisted of converting the original image to the HSV (hue, saturation and value) color space, which separates the color, saturation, and value of an image. The H channel (hue) of the HSV image was isolated to preserve the information related to hue (color), and filters were applied to smooth the image and remove noise, improving the quality and accuracy of the information.
For analysis purposes, the median and bilateral filters of the open-source OpenCV (Open Source Computer Vision) library, developed by Intel Corporation, Santa Clara, CA, USA, were applied. The performance of the filters was evaluated through the Structural Similarity Index Measure (SSIM), a metric used to evaluate the quality of an image, including the preservation of details, contrast, and sharpness.
The SSIM ranges from −1 to 1. A value of +1 indicates that the two images being compared are very similar, while a value of −1 indicates that the images are distinct [18]. Thus, the higher the SSIM value, the more similar the image is to the reference (the original). This procedure followed the methodology adopted in the research of Kalaiyarasi et al. [28], which analyzed the results of the median filter in medical images using SSIM. After the SSIM analysis, the filter with the highest value was selected.
To improve contrast, histogram equalization was applied, where a histogram represents the distribution of pixel intensities in an image, and equalization involves the redistribution of these intensities to achieve a more uniform distribution. This process is particularly useful for enhancing details in shadow areas and highlighting important features. By calculating the cumulative distribution function, histogram equalization transforms the image, increasing sharpness and improving contrast overall.
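As an illustration of this pre-processing stage, a minimal sketch using OpenCV and scikit-image is given below; the file name, the filter parameters, and the choice of comparing each filtered result against the unfiltered hue channel are assumptions for demonstration purposes, not the authors' exact settings.

```python
import cv2
from skimage.metrics import structural_similarity as ssim

# Load one exported thermal image (file name is illustrative) and convert it to HSV.
img = cv2.imread("thermal.jpg")
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
h_channel = hsv[:, :, 0]  # the hue channel keeps the color information

# Candidate smoothing filters: median and bilateral (parameters are illustrative).
median = cv2.medianBlur(h_channel, 5)
bilateral = cv2.bilateralFilter(h_channel, d=9, sigmaColor=75, sigmaSpace=75)

# Compare each filtered result with the unfiltered hue channel using SSIM
# and keep the filter that yields the higher index.
best = median if ssim(h_channel, median) >= ssim(h_channel, bilateral) else bilateral

# Histogram equalization to improve contrast before segmentation.
equalized = cv2.equalizeHist(best)
```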

Segmentation

The segmentation of the region of interest was performed by two methods: the segmentation method proposed by Otsu [29] and the color segmentation method. In Otsu's method, the filtered images were transformed into grayscale, with intensities ranging from 0 to 255, and the thresholding technique was applied, in which an image threshold was automatically determined from the histogram to eliminate the background of the images.
Erosion and dilation techniques were used to adjust the result of image segmentation, in which a morphological closure operator was used with a structuring element of the area of interest [30]. The adjustments were developed by first applying erosion to reduce the segmented areas and remove small noise, followed by dilation to expand the segmented regions, filling any gaps and connecting adjacent areas, using a Colab notebook.
Such procedures were developed according to the proposal of Bareli [31]. The resulting threshold was inverted so that the region of interest appeared white (255) and the background black (0). This process ensured that only the pixels associated with the targeted animals remained visible, while the background pixels were suppressed by assigning them a value of zero.
Then, the outlines of the binarized animals were identified in the image based on the application of OpenCV’s findContours algorithm and were drawn over the grayscale image.
After these steps, a blank mask was generated and a binarization was added with an automatically found threshold of 25, where the filtered contours were added, highlighting the areas of interest.
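A minimal sketch of this segmentation step with OpenCV is shown below, assuming `equalized` is the pre-processed single-channel image from the previous stage; the kernel size and number of iterations are illustrative choices.

```python
import cv2
import numpy as np

# 'equalized' is the pre-processed single-channel image from the previous step.
# Otsu's method picks the binarization threshold automatically from the histogram.
_, binary = cv2.threshold(equalized, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Erosion removes small noise; dilation restores the segmented regions and fills gaps
# (kernel size and iterations are illustrative).
kernel = np.ones((5, 5), np.uint8)
cleaned = cv2.dilate(cv2.erode(binary, kernel, iterations=1), kernel, iterations=1)

# Invert so that the region of interest becomes white (255) and the background black (0).
inverted = cv2.bitwise_not(cleaned)

# Extract the outlines of the binarized animals and draw them on a blank mask.
contours, _ = cv2.findContours(inverted, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
mask = np.zeros_like(inverted)
cv2.drawContours(mask, contours, -1, 255, thickness=cv2.FILLED)
```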
The color segmentation technique allowed the pigs, represented by the hottest pixels in the image, to be separated by color, using the HSV (hue, saturation, value) color space [31]. Thus, the HSV space gathered the information regarding color (hue) in a single channel, which was used to generate vectors containing the values referring to the lower limit (darker colors) and the upper limit (white).
For the segmentation by color, the proposal of Bareli [31] that considers segmentation based on HSV space was used. Table 1 shows the intervals for the colors yellow, blue, green, and red. For example, the yellow range spans the hue scale (H) from 10 to 50, representing the characteristic range of that color. The S-values are used to limit saturation, ranging from 100 to 255, while the V value is the luminance and defines the brightness range, ranging from 100 to 255.
After conversion to the HSV color space, masks were extracted, each designed to encompass a specific color range, with the exception of the cool shades, according to the scale of the images. This included the manual annotation of the minimum and maximum values of each range, providing the basis for the generation of the temperature range and the color palette of the ThermaCam software (Version 2.10) (Table 2).
This process aimed to identify the colors of the warmest pixels in the image. The segmentation stage was carried out with the exclusion of all pixels that did not belong to the region of interest, because in this way all the cold pixels of the image were zeroed.
Subsequently, all individual masks were unified, thus highlighting the areas of interest in the image. In addition, a supplementary mask was generated for the contours of the animals and the treatment of possible failures, according to the procedure previously described.
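A minimal sketch of this color segmentation with OpenCV is given below; the yellow HSV interval follows the example cited for Table 1, while the other interval and the file name are illustrative placeholders.

```python
import cv2
import numpy as np

img = cv2.imread("thermal.jpg")  # file name is illustrative
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)

# One (lower, upper) HSV interval per warm color. The yellow interval follows the
# example given for Table 1; the red interval is an illustrative placeholder.
color_ranges = {
    "yellow": ((10, 100, 100), (50, 255, 255)),
    "red": ((0, 100, 100), (10, 255, 255)),
}

# Build one mask per color range and unify them into a single mask of warm pixels.
combined = np.zeros(hsv.shape[:2], dtype=np.uint8)
for lower, upper in color_ranges.values():
    mask = cv2.inRange(hsv, np.array(lower), np.array(upper))
    combined = cv2.bitwise_or(combined, mask)

# Zero out every pixel outside the region of interest (the cold background).
segmented = cv2.bitwise_and(img, img, mask=combined)
```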
To validate an image segmentation method, it is necessary to compare automatically segmented images with manually segmented (reference) images. In this sense, specific reference segmentations were developed for this study and established as the standard. To produce the reference segmentations, the images were manually annotated on the online platform VGG Image Annotator (Version 2.0.11), delimiting the contour of the region of interest freehand, which yielded a file with the coordinates of the animals' edges. Using the OpenCV library and the Python language in the PyCharm IDE (Version 2023.2.2), the masks were generated and converted into binary images. The region of interest and the background of the image were identified through the colors white and black, corresponding to pixel values of 255 and 0, respectively. Subsequently, the image was cropped using the contour mask and OpenCV binary operators (cv2.bitwise_and).
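As a sketch of how such a reference mask can be built from the exported polygon coordinates (the function name and the assumption that the coordinates are available as two lists of x and y values are illustrative):

```python
import cv2
import numpy as np

def reference_mask(image, points_x, points_y):
    """Build a binary reference mask (255 = animal, 0 = background) from the polygon
    coordinates exported by the VGG Image Annotator and crop the image with it."""
    mask = np.zeros(image.shape[:2], dtype=np.uint8)
    polygon = np.array(list(zip(points_x, points_y)), dtype=np.int32)
    cv2.fillPoly(mask, [polygon], 255)
    cropped = cv2.bitwise_and(image, image, mask=mask)
    return mask, cropped
```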
To evaluate the results of pig segmentation in the thermographic images, evaluation metrics based on the Jaccard similarity index (Equation (1)), the Dice coefficient (Equation (2)) and the precision (Equation (3)) were introduced, proposed by Zhang et al. [32].
\text{Jaccard Index} = \frac{TP}{TP + FP + FN} \quad (1)
\text{Dice Coefficient} = \frac{2 \times TP}{2 \times TP + FP + FN} \quad (2)
\text{Precision}\;(P) = \frac{TP}{TP + FP} \quad (3)
The Jaccard Index is a metric that quantifies the similarity between two sets and in image segmentation, it is used to evaluate how well the automatically segmented image overlaps with the manually segmented image. The closer the Jaccard Index value is to 1, the greater the overlap between the two images and the more accurate the segmentation.
The Dice coefficient is also a similarity metric used to assess the overlap or agreement between two sets. It ranges from 0 to 1, where 0 indicates no overlap and 1 indicates a full agreement overlap. The closer to 1, the better the algorithm’s performance against the referral targeting.
The precision (Equation (3)) was calculated by considering the proportion of pixels correctly identified by the algorithm relative to the pixels found in the manual segmentation. Values closer to 1 indicate the effectiveness of the segmentation method in identifying these elements, while values farther from 1 signal a lower segmentation performance.
The metrics were evaluated considering true positives (TP), true negatives (TN), false positives (FP), and false negatives (FN). The TPs correspond to the pixels of the animals that were correctly identified by the segmentation method, in agreement with the manual segmentation. The TNs are the pixels that represent the background of the image.
FPs include pixels that were incorrectly included in the segmentation as part of the pig but are not, contrary to the manual segmentation; that is, background pixels mistakenly considered part of the pig. Finally, the FNs are the pixels of the pig contour and the pig area that were not identified by the segmentation, in contrast to the manual segmentation; they represent parts of the pig that the segmentation method could not correctly identify.
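A minimal sketch of how these metrics can be computed from a pair of binary masks is shown below; the function name is illustrative and the masks are assumed to use 255 for the animal and 0 for the background.

```python
import numpy as np

def segmentation_metrics(auto_mask, ref_mask):
    """Jaccard index, Dice coefficient and precision (Equations (1)-(3)) between an
    automatically segmented binary mask and the manual reference mask (values 0/255)."""
    auto = auto_mask > 0
    ref = ref_mask > 0
    tp = np.logical_and(auto, ref).sum()    # pig pixels found by both segmentations
    fp = np.logical_and(auto, ~ref).sum()   # background pixels labeled as pig
    fn = np.logical_and(~auto, ref).sum()   # pig pixels missed by the algorithm
    jaccard = tp / (tp + fp + fn)
    dice = 2 * tp / (2 * tp + fp + fn)
    precision = tp / (tp + fp)
    return jaccard, dice, precision
```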

Feature Extraction

The data extracted from the samples came from the previously segmented region of interest and consisted of the surface temperature values of the animals' skin. To extract the temperature values from the thermal images, a routine was developed using as input data the segmented images, the temperature range expressed on the image scales, and the pixel information of the rainbow color palette of the FLIR Tools software (Version 5.13), also used in the ThermaCam software (Version 2.10).
The routine creates a grid of points on the segmented image, with coordinates at a distance between pixels adjustable by the user, and a minimum of 30 points is recommended to represent the variability of the surface temperature of the animal’s body. The procedure creates a vector of temperatures corresponding to the rainbow color palette. In this process, the routine goes through the entire image, the pixels are analyzed in RGB format, and an interpolation function (Equation (4)) is used to convert the color values of these pixels into temperatures, based on the established palette.
f(x) = f(y_0) + \frac{f(y_1) - f(y_0)}{x_1 - x_0} \cdot (x - x_0) \quad (4)
where f(x) is the interpolated temperature, f(y1) and f(y0) are the temperatures associated with the colors of two known points on the palette, and x1 and x0 are the indices of these points on the palette, representing the positions of the closest and second-closest colors.
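A minimal sketch of this color-to-temperature conversion is given below, assuming the palette is available as an ordered array of RGB values and that the temperature scale of the image is linear; the distance-weighted fractional position between the two nearest palette colors is one possible reading of the procedure, not the authors' exact implementation.

```python
import numpy as np

def pixel_to_temperature(pixel_rgb, palette_rgb, t_min, t_max):
    """Estimate the temperature of one RGB pixel from the rainbow palette and the
    temperature range shown on the image scale, following Equation (4).
    `palette_rgb` is an (N, 3) array ordered from the coldest to the hottest color."""
    # Temperature associated with each palette position (linear scale assumption).
    temps = np.linspace(t_min, t_max, len(palette_rgb))

    # Indices of the closest and second-closest palette colors (x0 and x1).
    dist = np.linalg.norm(palette_rgb.astype(float) - np.asarray(pixel_rgb, dtype=float), axis=1)
    x0, x1 = np.sort(np.argsort(dist)[:2])

    # Fractional position of the pixel between x0 and x1 (distance-weighted), then
    # the linear interpolation of Equation (4).
    d0, d1 = dist[x0], dist[x1]
    x = x0 + (x1 - x0) * (d0 / (d0 + d1) if (d0 + d1) > 0 else 0.0)
    return temps[x0] + (temps[x1] - temps[x0]) / (x1 - x0) * (x - x0)
```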
To validate the temperature data obtained from the infrared images by means of the proposed algorithm, the ThermaCam Researcher software (Version 2.10) was used as a reference tool with a 30-day license, as used in the research of Wziątek-Kuczmik et al. [33]. Sixty images were selected for testing purposes, and from each image, 30 temperature points were extracted from the pigs’ bodies, along with their respective relative coordinates. These coordinates were then input into the proposed algorithm to extract temperatures from the same body locations of the animals (Figure 2). Subsequently, the averages of these temperatures were calculated for each image.
Then, these relative coordinates were employed in the tool developed to extract the corresponding temperature values. The performance of the routine was evaluated using the following metrics: Mean Absolute Error (MAE), Root Mean Squared Error (RMSE) and the Coefficient of determination (R2).
The MAE quantifies the mean of the absolute differences between the ThermaCam program and the values estimated by the proposed algorithm. The mathematical representation is explicit in Equation (5).
\text{MAE} = \frac{1}{n} \sum_{i=1}^{n} \left| y_i - \hat{y}_i \right| \quad (5)
where
  • n is the number of samples,
  • y is the observed value for each sample,
  • ŷ is the value predicted by the model for each sample,
  • | | represents the absolute value.
The Root Mean Squared Error (RMSE) calculates the square root of the mean squares of the differences between the ThermaCam program (Version 2.10) and the proposed algorithm (Equation (6)).
\text{RMSE} = \sqrt{\frac{1}{n} \sum_{i=1}^{n} \left( \text{Temp}_{\text{ThermaCam},i} - \text{Temp}_{\text{algorithm},i} \right)^2} \quad (6)
The Coefficient of Determination (R2) represents the proportion of the variation in the values obtained by the proposed algorithm that can be explained by the variation in the values obtained by the ThermaCam program (Version 2.10). It provides a measure of the quality of the prediction, ranging from 0 to 1. An R2 closer to 1 indicates a more accurate prediction (Equation (7)).
R^2 = 1 - \frac{\sum_{i=1}^{n} \left( \text{Temp}_{\text{ThermaCam},i} - \text{Temp}_{\text{estimated},i} \right)^2}{\sum_{i=1}^{n} \left( \text{Temp}_{\text{ThermaCam},i} - \overline{\text{Temp}}_{\text{estimated}} \right)^2} \quad (7)
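These agreement metrics can be computed directly from the two temperature series, as in the sketch below; the function name is illustrative, and the R2 term follows Equation (7) as written above.

```python
import numpy as np

def agreement_metrics(temp_thermacam, temp_algorithm):
    """MAE, RMSE and R2 (Equations (5)-(7)) between the ThermaCam reference
    temperatures and those extracted by the proposed routine."""
    y = np.asarray(temp_thermacam, dtype=float)      # ThermaCam values
    y_hat = np.asarray(temp_algorithm, dtype=float)  # values from the proposed algorithm
    mae = np.mean(np.abs(y - y_hat))
    rmse = np.sqrt(np.mean((y - y_hat) ** 2))
    # R2 as written in Equation (7): residuals measured against the mean of the
    # estimated temperatures.
    r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - np.mean(y_hat)) ** 2)
    return mae, rmse, r2
```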

Recognition and Interpretation—Classifier

For the classification of the animals into situations of thermal comfort or discomfort, a machine learning method was used: the Support Vector Machine (SVM). According to Rodriguez et al. [34], this method is used for pattern recognition in order to find decision boundaries that optimally separate classes, reducing classification errors. The SVM looks for a hyperplane or line that separates the data into the comfort and discomfort classes, so that the classes lie as far away from it as possible. According to Alfarzaeai et al. [35], this model demonstrates good performance on small and medium-sized datasets.
Skin surface temperature data, automatically extracted from the thermal images in the previous step, were used as input. The images were separated by treatment (air-conditioned and non-air-conditioned environment). From each thermal image, 30 points were extracted, totaling 6780 points, and the average per image was computed. Each image was given a label identifying the situation of the animals: images from air-conditioned environments received the label “comfort” (coded as 0), while those from non-air-conditioned environments received the label “discomfort” (coded as 1), and these labels were recorded in an Excel spreadsheet.
The training was conducted on Google Colab, starting with the importation of the necessary libraries, such as pandas, scikit-learn, and matplotlib, which were used for data manipulation, model training, and performance evaluation.
To prepare the training data before modeling, the data were normalized so that the values were within the range 0 to 1 and variables of different magnitudes had the same relevance for the algorithm. Normalization was performed according to Equation (8), in which Xmax and Xmin are the highest and lowest values of the variable, respectively:
X_{\text{norm}} = \frac{x - x_{\min}}{x_{\max} - x_{\min}} \quad (8)
To find the best combination of hyperparameters, a random search algorithm was implemented, which randomly selects values for the parameters of the SVM algorithm, such as the kernel function, the kernel scale, and the C value, responsible for regulating the maximum penalty applied to observations that violate the margin. After the algorithm was executed, the most effective hyperparameters were identified and selected for the model training. Using the radial basis function (RBF) kernel, the training was conducted with the optimized model and predictions were performed on the test set.
In order to obtain a more robust evaluation of classifier performance, a 5-fold cross-validation was used. The dataset (60 samples) was randomly partitioned into a training subset with 70% of the data (42 samples) and a prediction subset with 30% (18 samples). This process was repeated over five iterations, each with a different random partition, in order to mitigate possible biases in the evaluation of the model.
The performance evaluation included metrics such as precision (Equation (3)) and accuracy (Equation (9)).
\text{Accuracy} = \frac{TP + TN}{TP + TN + FP + FN} \quad (9)
In addition, the AUC (Area Under the Curve) of the ROC (Receiver Operating Characteristic) curve was evaluated. The ROC curve represents the true positive rate against the false positive rate for different classification threshold values. The closer the AUC is to 1, the better the model distinguishes between the classes; an AUC below 0.5 means the model performs similarly to chance.
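A minimal sketch of this training and evaluation pipeline with scikit-learn is shown below; the spreadsheet name, column names, hyperparameter ranges, and number of search iterations are assumptions for illustration, not the authors' exact settings.

```python
import pandas as pd
from scipy.stats import loguniform
from sklearn.metrics import accuracy_score, precision_score, roc_auc_score
from sklearn.model_selection import RandomizedSearchCV, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import MinMaxScaler
from sklearn.svm import SVC

# Per-image mean surface temperature and label (0 = comfort, 1 = discomfort);
# the file and column names are illustrative.
data = pd.read_excel("surface_temperatures.xlsx")
X = data[["mean_temperature"]].values
y = data["label"].values

# Min-max normalization (Equation (8)) followed by an RBF-kernel SVM.
model = Pipeline([("scale", MinMaxScaler()),
                  ("svm", SVC(kernel="rbf", probability=True))])

# Random search over the SVM hyperparameters (C and kernel scale gamma),
# with 5-fold cross-validation inside the search; ranges are illustrative.
search = RandomizedSearchCV(
    model,
    param_distributions={"svm__C": loguniform(1e-2, 1e3),
                         "svm__gamma": loguniform(1e-3, 1e1)},
    n_iter=30,
    cv=5,
    scoring="accuracy",
    random_state=0,
)

# 70/30 split for training and prediction.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)
search.fit(X_train, y_train)

# Evaluation on the held-out set: accuracy (Equation (9)), precision (Equation (3)) and AUC.
y_pred = search.predict(X_test)
y_prob = search.predict_proba(X_test)[:, 1]
print("accuracy:", accuracy_score(y_test, y_pred))
print("precision:", precision_score(y_test, y_pred))
print("AUC:", roc_auc_score(y_test, y_prob))
```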

3. Results

3.1. Preprocessing

The results of the pre-processing of the images for the application of the Otsu method are shown in Figure 3. In this figure, it is possible to observe the structural similarity index (SSIM) of the enhanced images obtained by filtering the bilateral and median methods, respectively.
The SSIM of the bilateral filter ranges from 0.68 to 0.90, while that of the median filter ranges from 0.69 to 0.93. The mean SSIM values for bilateral and median filtering are similar, at 0.82 and 0.86, respectively, indicating the improvement in the quality of the processed images. However, it is important to emphasize that the images enhanced by the median method outperform those filtered by the bilateral method, since the median filter reduces noise without compromising the information in the images, showing that it was more effective in noise suppression. These observations are in line with the studies of Draz et al. [36], which highlight the effectiveness of the median filter in situations with a high noise level.
The results obtained are close to the findings of the study carried out by Singh et al. [18], in which they applied the median filter to images of human lungs and found a mean for SSIM of 0.92. In the results of the study conducted by Aghamaleki and Ghorbani [37], it was found that the values of SSIM with a median filter range from 0.974 to 0.999. This shows that the application of the filters improved the quality of the image after the pre-processing techniques.
Bose et al. [38] justify the application of pre-processing techniques, such as filtering and segmentation, before the extraction of information from the thermal images as a necessary step, with implications for the quality and usefulness of the data obtained. These techniques favor the elimination of noise and the enhancement of characteristics, contributing to accurate and reliable diagnoses. Filtering makes images sharper, which facilitates interpretation, while segmentation is important for identifying and isolating regions of interest. Automating these processes can be particularly helpful.
Figure 4 shows the histograms of the original image (a) and the enhanced image (b), showing the distribution of the intensity of the pixels (shades of gray) in relation to the different gray levels on the scale of 0 to 255.
In Figure 4, each bar of the histogram represents the number of pixels that have an intensity corresponding to a specific range of shades of gray in the image. The peak at 2363 indicates that there are 2363 pixels in the image with a gray level of 133 (Figure 4a). However, it is possible to observe that several other pixel peaks are distributed throughout the image. This finding is in agreement with the findings of Zhang et al. [32], which also identified multiple peaks in the image’s histogram. These peaks can be attributed to discrepancies in thermal radiation between the background and the region of interest in the image, indicating a concentration of values in specific ranges. This variation in the histogram highlights the complexity of the intensity distribution in the thermal image, related to the distinct characteristics of different areas of the scene.
In Figure 4b, the equalization of the histogram contributed to distributing the pixel intensities more evenly along the range of available intensities. This change was observed in the areas where the contrast was improved, and the differences between the gray levels were magnified. However, even with these improvements, 18,303 pixels remain with a gray level of 0 and 8200 pixels with a gray level of 150. These peaks on the gray scale indicate a significant concentration of pixels corresponding to these specific temperatures, suggesting that a large portion of the thermal image area has temperatures close to these values. In the context of thermal images, these peaks help identify regions with uniform temperatures or specific thermal characteristics. It is worth noting that, according to Zhu et al. [39], histogram equalization can transform the distribution of intensities, widening the dynamic range of gray-level differences between pixels and improving the overall contrast of the polarizing thermal image, although some specific values may still persist.
Figure 5 shows the results of the images in the pre-processing step, in which bilateral and median filters were applied to the images converted to the HSV space, followed by histogram equalization.
The image of the animal before the execution of the pre-processing steps may be noisy, which compromises its quality for segmentation. In addition, in certain areas, the distinction between the animal’s body and the background becomes blurred, making the segmentation process more difficult. Figure 5a shows in the grayscale that the H channel (hue), in the HSV color space, captured the hue and isolated the color information of the image, highlighting the object of interest.
The reduction in noise is noticeable in Figure 5b, thanks to the application of the bilateral filter that softened the image. This effect is most evident in Figure 5c, where the median filter maintained the sharpness of the object’s edges, while smoothing out noise. These results were corroborated by SSIM, in which images treated with the bilateral filter exhibited a mean SSIM of 0.82, indicating lower image quality compared to the median filter. The median filter showed an average SSIM of 0.86, highlighting a superior performance in preserving the structure and details of the image.
In Figure 5d, the contrast has been improved, highlighting the difference between light and dark areas of the image. This step is essential to highlight the details of the image and facilitate segmentation. However, it is important to note that, even with these enhancements, the residual presence of noise, especially in the vicinity of the animal’s body, can affect the segmentation process in some images. This behavior is consistent with the results of the research conducted by Nazarudin et al. [40] that applied histogram equalization in medical images, resulting in a greater distinction between the light and dark areas of the images. Thus, the median filter was chosen to smooth and remove noise from the images.

3.1.1. Segmentation by the Otsu Method

Figure 6 shows the results of the application of the pre-processing techniques in the original image (Figure 6a) and in the filtered images (Figure 6b,c), as well as the results of the Otsu binarization (Figure 6d), the inverted binarization (Figure 6e) and the extracted contour (Figure 6f).
In Figure 6b, the image was converted to grayscale; in Figure 6c,d, the image was enhanced and the noise was removed. Otsu's method automatically calculated an intensity threshold that separated the object from the background, binarizing the image so that the pixels were classified as animal in black (value 0) and background in white (value 255) (Figure 6e). With this, only the pixels associated with the segmented objects remained visible in the image, while the background pixels were suppressed, resulting in a representation where the background appears black (value 0) (Figure 6f). In Figure 6g, after applying Otsu's binarization, the inversion operation was applied, in which white values become black and black values become white, to highlight the object of interest. From this, it was possible to extract the outline of the segmented image (Figure 6h).
In this way, it was possible to see that the regions of interest stand out clearly in relation to the background, providing a clear representation and facilitating the analysis and identification of the areas of interest. However, the proposed segmentation algorithm shows limitations when the animals appear close together in the scene or when only part of them is visible, as illustrated in Figure 7.
In these cases, the algorithm’s ability to differentiate objects of interest from the background stands out (Figure 7b–f), allowing it to recognize and isolate animals even when they are close to or partially visible in the image. This makes the method suitable for a variety of situations and scenarios where object segmentation is required, regardless of the complexity of the scene. However, the algorithm cannot distinguish the individual contour of the animals when they share the same area, or are lying on top of each other (Figure 7g,h).
The algorithm also has limitations in finding a suitable threshold for all images, due to the dissimilarity in the characteristics of the images: some images have well-defined edges, while others have low or high contrast. This observation is consistent with the findings of Tamoor, Naseer and Khan [41], who note the absence of a universally effective method for all types of images.

3.1.2. Color Segmentation

The automatic segmentation system based on the color method demonstrated a satisfactory performance in the segmentation of the thermal images of the animals. Figure 8 illustrates the result of the segmentation.
In Figure 8a, the segmentation algorithm combined several color ranges based on the predefined ranges and highlighted the region of interest in the thermal image. In Figure 8b, the algorithm converted the segmented image to grayscale, binarized the image (Figure 8c), and extracted the outline (Figure 8d).
Despite presenting good results in the segmentations, the algorithm acted in a similar way to Otsu’s segmentation and was unable to individualize the animals, as can be seen in Figure 9.
In Figure 9a, the segmentation algorithm highlighted the areas of interest contained in the image. In Figure 9b, the algorithm converted the segmented image to grayscale. In Figure 9c, a mask was generated representing the object of interest in white (value 255) and the rest of the scene in black (value 0), and in Figure 9d, the algorithm was unable to determine the individual contours of the segmented objects, extracting a single contour for the two animals with the OpenCV findContours algorithm, which represents a limitation of the proposed approach. These observations are corroborated by the studies of Carvalho and Coelho [42], which highlight the effectiveness of thermal image segmentation for the detection of objects of interest.

3.1.3. Evaluation Metrics for the Segmentations

The average performance of the Jaccard Index was 0.89. This finding is close to the values found in the study by Dumitru et al. [43] which found a Jaccard index ranging from 0.70 to 0.90. This demonstrates that this segmentation is capable of extracting the region of interest and can be used as the reference region. According to Queiroz et al. [44], the reference segmentation needs to represent the region of interest in the image under analysis; however, achieving perfect segmentation, with a Jaccard index equal to 1, is a goal that does not occur in practice.
This result is in line with the findings of the survey conducted by Zhang et al. [32], which also employed the technique of manual segmentation, without achieving perfect results.
The results of the segmentations obtained through the Otsu method and color segmentation are evaluated based on the mean of the Dice coefficient, the Jaccard index, and the precision between the automatically segmented and manually segmented images. This procedure was adopted with the specific purpose of identifying the most effective combination of these methods, aiming at optimizing the segmentation process in this phase of the research. Table 3 presents the comparison of the average performance of evaluation metrics for the two proposed segmentation methods.
The evaluation of the data from all images revealed that the Dice index reached mean values of 0.89 for Otsu and 0.90 for the color segmentation method, respectively. It was noted that the color segmentation method segmented the animals more precisely, because it had a higher Dice coefficient. A higher Dice coefficient indicates better targeting effectiveness [45]. The Dice coefficient according to the research of Yan et al. [46], serves as an indicator of the proportion of the area that has been correctly segmented compared to the total area that has been manually segmented. The closer the Dice coefficient is to 1, the greater the overlap and agreement between the automatically and manually segmented areas. These results demonstrate a proximity to the findings of the research conducted by Santos et al. [47], in which the segmentation method employed reached a Dice value of 0.90.
The Jaccard index achieved an average of 0.81 for the Otsu segmentation method and 0.83 for the color segmentation. Thus, the average Jaccard index demonstrates that the images segmented by color were the ones that presented results closest to the image segmented manually. Sharma et al. [48] highlight that the effectiveness of an analyzed segmentation method can be evaluated considering the Jaccard index. According to their conclusions, a Jaccard index closer to 1 is associated with superior performance, thus, higher values indicate a greater agreement between the results obtained and the established references. In the study conducted by Kumar et al. [47], a Jaccard index of 0.82 was identified in its segmentation method.
The average accuracy for Otsu's method is 87%, while for the color segmentation method it is 88%. These results are close to those obtained in the study by Gomathi et al. [49], where the average accuracy ranged from 92 to 97%. Aleid et al. [50] achieved an even higher accuracy, with a value of 99.5% in their segmentation results.
These metrics, according to Zhang et al. [32], have the ability to determine whether or not the segmentation method being evaluated is applicable. Thus, for the extraction of characteristics, the color segmentation method was chosen, as it presented the best segmentation contour based on the performance evaluation metrics.
In general, the results obtained in the overlapping measures showed that there was an 81% intersection between the automatically segmented regions and the manually segmented regions. The segmentation algorithm was able to delimit the region of interest, showing that the automatic segmentation evaluated can be implemented in the analysis of thermal data with low computational requirements and highlights its potential to contribute to the diagnosis of heat stress in animals.

3.2. Feature Extraction

The results of the regression metrics, including MAE, RMSE and R2, indicated values of 0.20 and 0.07 for the air-conditioned environment, and 0.25 and 0.09 for the non-air-conditioned environment, respectively. The performance of the temperature extraction algorithm in relation to the program is shown in Table 4.
The average surface temperatures of the animals extracted from the thermal images throughout the experimental period, obtained by the proposed algorithm and the ThermaCam program in the climatized environment at 08:00, 12:00, and 16:00 h, are shown in Figure 10.
Figure 10a shows that during the morning period, the average surface temperatures of the pigs in the climatized environment ranged from 34.2 to 37.2 °C. At 12:00 h, the average temperatures reached values of the order of 37.1 to 38.6 °C, and around 16:00 h, the skin temperatures of the animals decreased, oscillating between 35.2 and 37.6 °C. The mean difference between the temperature values obtained by the proposed algorithm and the ThermaCam program was 0.20 °C, the maximum was 0.80 °C and the minimum was 0.00 °C. At some points, the proposed algorithm provides slightly lower temperatures than ThermaCam, while in other cases, the temperatures extracted by the algorithm are slightly higher. In the analysis of Figure 10b, the coefficient of determination reached 0.96, indicating that approximately 96% of the variability in the temperatures extracted by the proposed algorithm can be explained by the variation in the ThermaCam values, demonstrating the correspondence between the temperatures estimated by the algorithm and those from the ThermaCam program.
The average surface temperatures from the thermal images of the animals recorded at 08:00, 12:00, and 16:00 h, of the experimental period obtained by the proposed algorithm and the ThermaCam program in the non-climatized environment, are shown in Figure 11.
It can be observed in Figure 11a that the average temperature variation at 08:00 h was from 34.95 to 37.49 °C, at 12:00 h the variation was from 35.18 to 41.50 °C, and at 16:00 h the variation was from 33.77 to 38.31 °C. The mean difference between the temperature values estimated by the proposed algorithm and the ThermaCam program was 0.25 °C, the maximum was 0.82 °C and the minimum was 0.01 °C. In Figure 11b, the coefficient of determination reached 0.93, indicating that approximately 93% of the variation in the temperatures extracted by the algorithm can be explained by the variation in the ThermaCam values. This indicates a significant relationship between the estimated temperatures and those coming from the ThermaCam program, which shows the capability of the temperature extraction technique.
Regarding the differences found between the temperatures extracted from the thermal images by the tool of this study and the reference program, the need for adjustments in the algorithm to improve its accuracy in comparison with an already established software is noted. The average differences found between the proposed algorithm and ThermaCam may be related to the linear mathematical model used, since the association of pixels with the temperatures of the thermal images may not be adequately represented by a linear function. This is due to the pattern of the animals' thermal responses, which vary under different environmental conditions, according to the research conducted by Tito et al. [6], who argue that the accuracy of thermal imaging is influenced by the environment. Wang et al. [51] mention that precipitation, wind, humidity, and air temperature affect the results of the thermal images.
In the environment without air conditioning, the absence of misting (fogging) on the animals' skin impacts the thermal patterns and the relationship between the pixel values and the actual temperatures. In addition, the behavior of the animals, such as lying on feces and urine to dissipate heat, contributes to this nonlinearity. Environmental factors, such as temperature and humidity, also create a more complex setting, which likewise influences the nonlinear thermal responses. Using a linear function to associate pixels with temperatures does not take into account the complex, nonlinear thermal dynamics that occur in uncontrolled environments.
Although there are differences between the estimated and reference temperatures, the proposed algorithm offers significant advantages for the extraction of temperatures from thermal images. As pointed out by Irujo [52], programs integrated into infrared cameras often have limited functionality due to their closed-source nature, and are designed primarily for non-scientific users due to financial costs. On the contrary, the tool developed in this study stands out for its potential in the scientific environment, since it was developed in open source and has flexibility for adaptations.
In addition, in the software associated with thermal images, as indicated by Nosratil et al. [53], the manual choice of temperature points or areas is required, which becomes impractical when there is a need to handle a large volume of images. The method proposed in the study employs a user-defined pixel spacing (a minimum of 30 points is recommended), with extraction performed in a fully automated manner. This way of obtaining temperatures in a fixed spacing between pixels was also adopted in the work of Borges et al. [54], where the authors made use of temperature analysis of thermal images of a calf hoof by defining a fixed distance between pixels.
Another point highlighted in the investigation of Crameri et al. [55] is the choice of different color palettes in the software used for thermal imaging, which can compromise the accurate representation of the data. The method adopted in this study uses the palette recommended for biological tissues, the rainbow palette, with cooler values in black and warmer values in white. For thermal imaging, this palette is the most suitable, as indicated by Shaikh et al. [56], for indicating the least temperature variation.
Other open-source algorithms have been created to obtain raw data from thermal images; however, they depend on additional software to extract the data from the images, such as the open-source IRimage software developed in the research of Irujo [52], which uses ExifTool as an add-on. The method suggested in this study uses the information from the color palette and mathematical interpolation to acquire the raw temperature values of the images.
The proposed algorithm can also receive a file containing the relative coordinates and extract the temperature values from these coordinates. In addition, it also automatically determines the hottest regions of the animal’s body, which can be useful in identifying localized diseases, or in identifying the regions that have the greatest correlation with rectal temperature, such as the ears and eye region, as shown in Figure 12.
As illustrated in Figure 12, the proposed algorithm highlighted the eye and ear region as the areas with the highest temperature of the animal. The areas identified by the algorithm coincide with those investigated by Gorczyca et al. [57] that sought to predict rectal temperature in pigs through neural networks. The animals’ surface temperatures correlated with core temperatures, and the researchers observed that the ear and eye regions exhibited the highest temperatures.
In addition, the relationship between the areas identified by the algorithm and the regions investigated by Gorczyca et al. [57] strengthens the reliability of the results obtained. The identification of the eye and ear regions as the warmest may have significant implications, suggesting the viability of these areas as reliable indicators of pig indoor temperatures. This correspondence between the algorithm’s results and previous research validates the effectiveness of the proposed approach. The accuracy in identifying these hot regions reinforces the usefulness of the algorithm not only in thermal detection, but also in the potential for predicting physiological parameters essential for monitoring animal health.

3.3. Thermal Comfort Assessment

The environmental variables recorded in the air-conditioned stalls indicated a variation in air temperature from 22.87 to 23.08 °C and in relative humidity from 70.30 to 84.79%. These temperature values place the facility within the appropriate air temperature range for animals in the growing phases (22–26 °C), as defined by Perdomo et al. [58]. The results obtained in this environment showed temperature values close to those of Kiefer et al. [59], who found temperatures between 18 and 26 °C and RH of 75% for growing pigs in air-conditioned environments.
The air temperature and relative humidity in the stalls without air conditioning varied from 24.55 to 34.94 °C and from 50.64 to 74.72%, respectively, values that exceed the indicative threshold of thermal comfort defined by Perdomo et al. [58].
The results of the statistical analysis demonstrated the thermal profile of the animals submitted to the different air conditioning systems (Table 5).
The average skin surface temperature of the animals submitted to the climatized environment ranged from 30.40 to 35.70 °C. The mean minimum and maximum temperature values on the skin of the animals were 22.80 and 38.40 °C. The mean value identified in this research is close to the data obtained in related studies that analyzed the skin temperature of pigs in air-conditioned environments. As evidenced by Crone et al. [60], the average temperature recorded during the transport of these animals in air-conditioned conditions was 34 °C. Analogously, Santos et al. [61] observed in their study that the average temperature of pigs in an environment provided with a water depth was 34.50 °C. In the research of Vásquez et al. [62], pigs in rearing environments within the thermoneutral zone for growing animals (around 23 °C) showed skin temperature values of 39.1 °C.
The average temperature of the animals subjected to the natural environment (without air conditioning) ranged from 33.02 to 38.38 °C. The minimum and maximum values found on the surface of the animals in the natural environment were 34.10 and 42.80 °C. The maximum temperature of 42.80 °C recorded on the skin of the animals demonstrates that the animals experienced high heat stress, since the literature reports a maximum temperature of 36 °C for animals in a thermally comfortable environment [60]. This shows that animals kept in an environment without air conditioning exceed their normal body temperature. It should be noted that this scenario illustrates an expected pattern: when animals are at temperatures outside the thermoneutral zone, the effects of the microclimate are observed on the surface of the animals.
Similar results were identified by Brown-Brandl et al. [4], who reported maximum skin temperatures in pigs ranging from 40 °C for the Duroc line to 41.9 °C for pigs of other genetic lines, at different air temperatures. In the research developed by Vásquez et al. [62], the skin temperature of growing pigs in natural environments ranged between 41.0 and 41.5 °C.
In the research conducted by Jia et al. [63], maximum skin temperatures of 39.12 °C were identified in pigs kept in an environment with an air temperature of 33.10 °C and relative humidity of 91.80%. The authors point out that air temperature and relative humidity influence the skin temperature of animals through convection, conduction and other heat transfer mechanisms. These findings show that skin temperature results from a complex interaction between environmental factors and internal body temperature.
The study identified distinct thermal patterns in the animals under the different production environments: the thermal profile of the animals subjected to heat stress showed a perceptibly different pattern from that of the animals kept under air conditioning. As noted by Alves et al. [64], understanding how pigs respond to various environmental conditions is essential to ensure their well-being and to optimize production efficiency.
By analyzing the thermal profile of pigs, researchers and producers can develop effective temperature control strategies for the rearing environment to mitigate the potential adverse effects of temperature extremes on pig behavior and productivity.

3.4. Classifier

In the classification scenario, the SVM was able to classify animals in situations of thermal comfort and discomfort using the skin surface temperature information, as shown in Table 6.
The model correctly classified 91% of the instances (accuracy) and reached a precision of 80%. In the work of Wang et al. [51], an SVM was used to detect estrus in cows from thermal infrared temperature variation and achieved an accuracy of 81.42%.
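A minimal sketch of this classification step is given below, assuming per-image surface temperature features (mean, minimum, maximum), an RBF-kernel SVM and synthetic data; the feature layout and hyperparameters are assumptions, not the exact configuration used in the study.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, precision_score, roc_auc_score

# Synthetic features: [mean, min, max] surface temperature per image (°C)
rng = np.random.default_rng(0)
comfort = rng.normal([33.0, 28.0, 36.0], 1.0, size=(60, 3))      # air-conditioned pen
discomfort = rng.normal([36.0, 34.0, 41.0], 1.0, size=(60, 3))   # non-air-conditioned pen
X = np.vstack([comfort, discomfort])
y = np.array([0] * 60 + [1] * 60)   # 0 = thermal comfort, 1 = thermal discomfort

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, probability=True))
clf.fit(X_tr, y_tr)

y_pred = clf.predict(X_te)
y_prob = clf.predict_proba(X_te)[:, 1]
print("Accuracy :", accuracy_score(y_te, y_pred))
print("Precision:", precision_score(y_te, y_pred))
print("AUC      :", roc_auc_score(y_te, y_prob))
```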
In the study by Sadeghi et al. [65], the overall accuracy of the SVM in identifying birds with avian influenza infections was 80.56%, based on 120 thermal images. The infections caused elevations in the birds’ temperature, which were visible in the thermal images, allowing for effective detection using the SVM model.
In the work presented by Jaddoa et al. [66], an automatic eye-segmentation method based on thermal images was proposed. An SVM classifier was used to segment the cow’s face region; automatic boundary processing, hot spot detection and refinement techniques were then applied to locate the animal’s eyes. The results revealed that the accuracy of this method reached 72.12%.
To automatically locate the cows’ eyes and udders, Saeedi et al. [67] analyzed the thermal infrared image histogram, followed by automated detection of eye position based on the hue, saturation and value (HSV) components of the image. They then used SVM classification to automatically identify the position of the cow’s udder. The results indicated an accuracy of 68.67% in recognizing video frames, with a positioning error of less than 20 pixels. In addition, the detection accuracy of this method for clinical mastitis reached 87.5%.
In a medical imaging application, the SVM reached an accuracy of 80% for classifying brain tumors [67], while McIntyre and Tuba [68], using the SVM for the same task, obtained an accuracy of 91.21%.
In the study by Yang et al. [69], which classified broiler behaviors, including resting, using SVM and triaxial accelerometer data, accuracies of 88% and 75% were achieved for feeding behaviors, and 83% and 62% for drinking behaviors.
Thus, the SVM model recognized patterns in the temperature distributions that characterize animals under thermal comfort and discomfort. Pigs are sensitive to heat and tend to respond to the environment to which they are subjected. Surface temperatures and specific behaviors, such as lying down, are significantly correlated with thermal comfort, in agreement with Andersen et al. [70], who report that skin temperature and lying behavior are used as indices of the thermal state of pigs, which was also observed in this research. Lying in a fully recumbent position increases heat loss to the floor.
These data can be considered indicators for estimating the thermal comfort of pigs in different thermal environments. Thermal imaging can provide accurate information about the condition of the animals, especially in relation to heat stress, since regions of elevated temperature in the images can indicate that the animals are under heat stress.
By analyzing the surface temperatures of the pigs, the SVM model classified the images, contributing to the early identification of heat stress situations and facilitating the implementation of corrective measures. The tool resulting from this study can be used as a diagnostic aid, assisting veterinarians in the adoption of preventive measures, as the harmful effects of heat stress in pigs range from reduced performance to health problems.
The proposed algorithm can be employed to analyze original thermal images acquired with a thermal camera connected to a computer and to cloud storage. According to Hoffer et al. [71], these thermal cameras have a relatively low cost, which can be an advantage for the implementation of the proposed algorithm. The algorithm loads the image and performs analysis, segmentation and detection of the pig’s body surface temperatures from the captured frames, eliminating the need for direct contact. This technique simplifies the monitoring of the animals, allowing the identification of elevated body temperatures, and can assist in veterinary diagnosis.
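A hedged sketch of such a processing flow is shown below: a thermal frame exported as a color image is segmented in HSV space, the mask is cleaned with a median filter, and the surface temperatures inside the mask are summarized. The file name, the HSV bounds and the linear intensity-to-temperature calibration are illustrative assumptions rather than the exact routines of the proposed algorithm.

```python
import cv2
import numpy as np

def analyze_frame(path: str, t_min: float = 22.0, t_max: float = 43.0):
    """Segment the animal in a thermal frame and summarize its surface temperatures."""
    bgr = cv2.imread(path)                                   # thermal frame exported as a color image
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array((10, 100, 100)), np.array((50, 255, 255)))  # warm (yellow) tones
    mask = cv2.medianBlur(mask, 5)                           # remove isolated pixels

    gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY).astype(np.float32)
    temps = t_min + (gray / 255.0) * (t_max - t_min)         # assumed linear intensity-to-temperature calibration
    animal_temps = temps[mask > 0]
    return animal_temps.mean(), animal_temps.max()

# Hypothetical file name used only for illustration
mean_t, max_t = analyze_frame("pig_thermal_frame.png")
print(f"Mean surface temperature: {mean_t:.2f} °C, maximum: {max_t:.2f} °C")
```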
According to Conceição et al. [72], body temperature is an important indicator of animal health, since it can reveal stress, local inflammation and pathologies in general. In this way, this tool can contribute significantly to swine production, since it has the potential to identify diseases early and prevent their spread, helping to control and optimize the allocation of resources in animal production through more efficient monitoring of pigs.
In addition, the proposal can be adapted to estimate respiratory rate, as demonstrated by Stewart et al. [73], who used thermal images of cattle nostrils to measure the temperature difference between inspiration and expiration, validating thermography as a non-invasive and remote technique for measuring respiratory rate. These applications will benefit both the health of the animals and the efficiency of pig production as a whole.
This procedure requires no direct contact with the animal, which contributes to the accuracy of data collection and promotes the well-being of the pigs, ensuring a quieter environment free of unwanted human interference, as explained by Coşkun et al. [8]. In this way, the proposal may represent a significant advance in the monitoring of animals in pig production.

4. Conclusions

This study presented a sequential methodology for the automated extraction of features from thermal images, followed by the classification of heat stress in pigs using machine learning techniques. The most efficient segmentation method proved to be the color-based one, with a similarity coefficient of 0.90, and the temperatures estimated by the proposed feature extraction algorithm diverged by 0.80 °C from the reference program. The machine learning model exhibited a precision of 80% and an accuracy of 91%, allowing automatic detection and classification of pigs in states of thermal comfort and discomfort. For future improvements, it is suggested to explore more complex mathematical models in the feature extraction algorithm and to test different machine learning models.

Author Contributions

Conceptualization, M.d.F.A.A., H.P. and G.L.P.d.A.; methodology, M.d.F.A.A. and H.P.; software, M.d.F.A.A., R.G.F.S. and T.C.S.; validation, M.d.F.A.A., R.G.F.S. and T.C.S.; Resources, M.V.d.S.; formal analysis: M.d.F.A.A. and T.C.S.; data curation, M.d.F.A.A. and R.G.F.S.; writing—original draft, H.P. and G.L.P.d.A.; writing—review and editing, G.L.P.d.A., T.C.S. and M.V.d.S.; visualization, H.P., G.L.P.d.A. and M.V.d.S.; funding acquisition, H.P. and M.V.d.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The research was approved by the CEUA/UFRPE (Animal Ethics Committee) under protocol number 23082.021090/2016-81, license issuance date: 5 December 2016.

Data Availability Statement

No new data were created or analyzed in this study. Data sharing is not applicable to this article.

Acknowledgments

To the Programa de Pós-Graduação em Engenharia Agrícola (PGEA) and the Grupo de Pesquisa em Ambiência (GPESA) of the Universidade Federal Rural de Pernambuco (UFRPE) for the support necessary for the development of this study, and to the Coordenação para o Aperfeiçoamento do Pessoal do Ensino Superior (CAPES—Finance Code 001) and the Fundação de Apoio à Investigação do Estado de Pernambuco (FACEPE) for the research scholarships granted.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Machado, S.T.; Nääs, I.D.A.; dos Reis, J.G.M.; Costa Neto, P.L.D.O.; Toloi, R.C.; Santos, R.C.; Vendrametto, O.; Sanches, A.C. Impacto do microclima do caminhão na temperatura superficial de suínos durante a logística pré-abate. Res. Soc. Dev. 2021, 10, 13. Available online: https://rsdjournal.org/index.php/rsd/article/view/21077 (accessed on 3 September 2024).
  2. de Oliveira Alves, D.; Przyvara Rizzatti, E.; Alves Feitosa Filho, L.; Grisa Hahn, K. Custo de produção da suinocultura: Comparativo de rentabilidade da suinocultura de cria e recria nos períodos de 2019 a 2022, em uma unidade produtiva situada no município de Ampére no sudoeste do Paraná. RECIMA21 Rev. Científica Multidiscip. 2023, 4, e414465. [Google Scholar] [CrossRef]
  3. Cai, Z.; Cui, J.; Yuan, H.; Cheng, M. Application and research progress of infrared thermography in temperature measurement of livestock and poultry animals: A review. Comput. Electron. Agric. 2023, 205, 107586. [Google Scholar] [CrossRef]
  4. Brown-Brandl, T.M.; Hayes, M.D.; Rohrer, G.A.; Eigenberg, R.A. Thermal comfort evaluation of three genetic lines of nursery pigs using thermal images. Biosyst. Eng. 2023, 225, 1–12. [Google Scholar] [CrossRef]
  5. da Silva Rodrigues, A.V.; Martello, L.S.; Pacheco, V.M.; de Souza Sardinha, E.J.; Pereira, A.L.V.; de Sousa, R.V. Thermal signature: A method to extract characteristics from infrared thermography data applied to the development of animal heat stress classifier models. J. Therm. Biol. 2023, 115, 103609. [Google Scholar]
  6. Titto, C.G.; Henrique, F.L.; Pantoja, M.H.D.A.; Çakmakçı, C.; Silva, P.D.S. Editorial: Behavior and heat stress. Front. Vet. Sci. 2023, 10. [Google Scholar] [CrossRef]
  7. Wang, Z.; Wang, S.; Wang, C.; Zhang, Y.; Zong, Z.; Wang, H.; Su, L.; Du, Y. A Non-Contact Cow Estrus Monitoring Method Based on the Thermal Infrared Images of Cows. Agriculture 2023, 13, 385. [Google Scholar] [CrossRef]
  8. Coşkun, G.; Şahin, Ö.; Delialioğlu, R.A.; Altay, Y.; Aytekin, İ. Diagnosis of lameness via data mining algorithm by using thermal camera and image processing method in Brown Swiss cows. Trop. Anim. Health Prod. 2023, 55, 50. [Google Scholar] [CrossRef]
  9. Wilson, A.N.; Gupta, K.A.; Koduru, B.H.; Kumar, A.; Jha, A.; Cenkeramaddi, L.R. Recent Advances in Thermal Imaging and its Applications Using Machine Learning: A Review. IEEE Sens. J. 2023, 23, 3395–3407. [Google Scholar] [CrossRef]
  10. Godyń, D.; Herbut, P. Applications of continuous body temperature measurements in pigs—A review. Anim. Sci. For. Wood Technol. Hortic. Landsc. Archit. Land Reclam. 2018, 56, 209–220. [Google Scholar] [CrossRef]
  11. He, J.; Zhang, X.; Li, S.; Gan, Q. Effects of ambient temperature and relative humidity and measurement site on the cow’s body temperature measured by infrared thermography. J. Zhejiang Univ. (Agric. Life Sci.) 2020, 46, 500–508. [Google Scholar]
  12. Kadirvel, G.; Gonmei, C.; Singh, N.S. Assessment of Rectal Temperature using Infrared Thermal Camera in Pigs. Indian J. Sci. Technol. 2022, 15, 2041–2046. [Google Scholar] [CrossRef]
  13. Wang, B.; Qi, J.; An, X.; Wang, Y. Heterogeneous fusion of biometric and deep physiological features for accurate porcine cough recognition. PLoS ONE 2024, 19, e0297655. [Google Scholar] [CrossRef]
  14. Wang, S.; Jiang, H.; Qiao, Y.; Jiang, S. A Method for Obtaining 3D Point Cloud Data by Combining 2D Image Segmentation and Depth Information of Pigs. Animals 2023, 13, 2472. [Google Scholar] [CrossRef] [PubMed]
  15. Küster, S.; Haverkamp, L.; Schlather, M.; Traulsen, I. An Approach towards a Practicable Assessment of Neonatal Piglet Body Core Temperature Using Automatic Object Detection Based on Thermal Images. Agriculture 2023, 13, 812. [Google Scholar] [CrossRef]
  16. Xiong, Y.; Li, G.; Willard, N.C.; Ellis, M.; Gates, R.S. Modeling neonatal piglet rectal temperature with thermography and machine learning. J. ASABE 2023, 66, 193–204. [Google Scholar] [CrossRef]
  17. Tucker, B.S.; Jorquera-Chavez, M.; Petrovski, K.R.; Craig, J.R.; Morrison, R.S.; Smits, R.J.; Kirkwood, R.N. Comparing surface temperature locations with rectal temperature in neonatal piglets under production conditions. J. Appl. Anim. Res. 2023, 51, 212–219. [Google Scholar] [CrossRef]
  18. Singh, O.; Kashyap, K.L.; Singh, K.K. Meshless technique for lung computed tomography image enhancement. Biomed. Signal Process. Control. 2023, 81, 104452. [Google Scholar] [CrossRef]
  19. Liu, Y.; Wang, F.; Liu, K.; Mostacci, M.; Yao, Y.; Sfarra, S. Deep convolutional autoencoder thermography for artwork defect detection. Quant. Infrared Thermogr. J. 2023, 1–17. [Google Scholar] [CrossRef]
  20. McManus, R.; Boden, L.A.; Weir, W.; Viora, L.; Barker, R.; Kim, Y.; McBride, P.; Yang, S. Thermography for disease detection in livestock: A scoping review. Front. Vet. Sci. 2022, 9, 965622. [Google Scholar] [CrossRef]
  21. Colaco, S.J.; Kim, J.H.; Poulose, A.; Neethirajan, S.; Han, D.S. DISubNet: Depthwise Separable Inception Subnetwork for Pig Treatment Classification Using Thermal Data. Animals 2023, 13, 1184. [Google Scholar] [CrossRef] [PubMed]
  22. Jiao, F.; Wang, K.; Shuang, F.; Dong, D.; Jiao, L. A Smartphone-Based Sensor with an Uncooled Infrared Thermal Camera for Accurate Temperature Measurement of Pig Groups. Front. Phys. 2022, 10, 893131. [Google Scholar] [CrossRef]
  23. Nolêto, R.M.A.; Nolêto, C.; Santos, N.P.S.; Madeira, A.M.A. Inovações no Reconhecimento e Detecção de Animais: Uma Análise da Literatura com Ênfase em Redes Neurais e Aprendizado de Máquina. In Anais do XVI Encontro Unificado de Computação do Piauí (ENUCOMPI 2023); Sociedade Brasileira de Computação, 2023; pp. 33–40. Available online: https://scholar.google.com.br/citations?view_op=view_citation&hl=pt-BR&user=rHzm68cAAAAJ&citation_for_view=rHzm68cAAAAJ:W7OEmFMy1HYC (accessed on 10 October 2023).
  24. Whittaker, A.L.; Muns, R.; Wang, D.; Martínez-Burnes, J.; Hernández-Ávalos, I.; Casas-Alvarado, A.; Domínguez-Oliva, A.; Mota-Rojas, D. Assessment of Pain and Inflammation in Domestic Animals Using Infrared Thermography: A Narrative Review. Animals 2023, 13, 2065. [Google Scholar] [CrossRef] [PubMed]
  25. dos Reis, H.S.; da Paz, C.D.; Cocozza, F.D.M.; de Oliveira, J.G.A.; Silva, M.A.V. Plantas medicinais da caatinga: Uma revisão integrativa dos saberes etnobotânicos no semiárido nordestino. Arq. Ciências Saúde UNIPAR 2023, 27, 874–900. [Google Scholar] [CrossRef]
  26. Diniz, C.D.d.S.C.; Ataíde, E.M. Different substrates in the germination of pomegranate seeds. Braz. J. Anim. Environ. Res. 2023, 6, 1876–1882. [Google Scholar] [CrossRef]
  27. Barbosa Filho, J.A.D.; Silva, I.J.O.; Silva, M.A.N.; Silva, C.J.M. Avaliação dos comportamentos de aves poedeiras utilizando sequência de imagens. Eng. Agrícola 2007, 27, 93–99. [Google Scholar] [CrossRef]
  28. Kalaiyarasi, M.; Janaki, R.; Sampath, A.; Ganage, D.; Chincholkar, Y.D. Budaraju Non-additive noise reduction in medical images using bilateral filtering and modular neural networks. Soft Comput. 2023, 1–10. [Google Scholar] [CrossRef]
  29. Otsu, N. A Threshold Selection Method from Gray-Level Histograms. IEEE Trans. Syst. Man Cybern. 1979, 9, 62–66. [Google Scholar] [CrossRef]
  30. Gonzalez, R.C.; Woods, R.E. Digital Image Processing; Prentice Hall, 2007. [Google Scholar]
  31. Bareli, F. Introdução à Visão Computacional: Uma Abordagem Prática com Python e OpenCV; 2019. [Google Scholar]
  32. Zhang, F.; Dai, Y.; Peng, X.; Wu, C.; Zhu, X.; Zhou, R.; Wu, Y. Brightness segmentation-based plateau histogram equalization algorithm for displaying high dynamic range infrared images. Infrared Phys. Technol. 2023, 134, 104894. [Google Scholar] [CrossRef]
  33. Wziątek-Kuczmik, D.; Niedzielska, I.; Mrowiec, A.; Bałamut, K.; Handzel, M.; Szurko, A. Is Thermal Imaging a Helpful Tool in Diagnosis of Asymptomatic Odontogenic Infection Foci—A Pilot Study. Int. J. Environ. Res. Public Health 2022, 19, 16325. [Google Scholar] [CrossRef]
  34. Rodriguez, P.C.L.; Franca, A.S.; Pereira, F.G.; Nunes, R.B.; Cani, S.P.N.; Rampinelli Fernandes, M. Máquina De Vetores De Suporte Para Classificação De Anomalias Em Trilho a Partir De Características De Textura De Imagens Digitais. Rev. Ifes Ciência 2023, 9, 1–12. [Google Scholar] [CrossRef]
  35. Alfarzaeai, M.S.; Hu, E.; Peng, W.; Qiang, N.; Alkainaeai, M.M.A. Coal Gangue Classification Based on the Feature Extraction of the Volume Visual Perception ExM-SVM. Energies 2023, 16, 2064. [Google Scholar] [CrossRef]
  36. Draz, H.H.; Elashker, N.E.; Mahmoud, M.M.A. Optimized Algorithms and Hardware Implementation of Median Filter for Image Processing. Circuits Syst. Signal Process. 2023, 42, 5545–5558. [Google Scholar] [CrossRef]
  37. Aghamaleki, J.A.; Ghorbani, A. Image fusion using dual tree discrete wavelet transform and weights optimization. Vis. Comput. 2023, 39, 1181–1191. [Google Scholar] [CrossRef]
  38. Bose, A.; Maulik, U.; Sarkar, A. An entropy-based membership approach on type-II fuzzy set (EMT2FCM) for biomedical image segmentation. Eng. Appl. Artif. Intell. 2023, 127, 107267. [Google Scholar] [CrossRef]
  39. Zhu, Y.; Nie, X.; Li, Y.; Nie, C.; Wang, C.; Gao, Z. A Novel Fault Diagnosis Method for Train Real-Time Ethernet Network Based on Physical Layer Electrical Signal Features. IEEJ Trans. Electr. Electron. Eng. 2023, 18, 1673–1681. [Google Scholar] [CrossRef]
  40. Nazarudin, A.A.; Zulkarnain, N.; Mokri, S.S.; Zaki, W.M.D.W.; Hussain, A.; Ahmad, M.F.; Nordin, I.N.A.M. Performance Analysis of a Novel Hybrid Segmentation Method for Polycystic Ovarian Syndrome Monitoring. Diagnostics 2023, 13, 750. [Google Scholar] [CrossRef] [PubMed]
  41. Tamoor, M.; Naseer, A.; Khan, A.; Zafar, K. Skin Lesion Segmentation Using an Ensemble of Different Image Processing Methods. Diagnostics 2023, 13, 2684. [Google Scholar] [CrossRef]
  42. Carlos de Carvalho, E.; Martins Coelho, A.; Conci, A.; Baffa, M.D.F.O. U-Net Convolutional Neural Networks for breast IR imaging segmentation on frontal and lateral view. Comput. Methods Biomech. Biomed. Eng. Imaging Vis. 2023, 11, 311–316. [Google Scholar] [CrossRef]
  43. Dumitru, R.G.; Peteleaza, D.; Craciun, C. Using DUCK-Net for polyp image segmentation. Sci. Rep. 2023, 13, 9803. [Google Scholar] [CrossRef]
  44. da Queiroz, K.F.F.C.; de Queiroz Júnior, J.R.A.; Dourado, H.; de Lima, R.D.C.F. Automatic segmentation of region of interest for breast thermographic image classification. Res. Biomed. Eng. 2023, 39, 199–208. [Google Scholar] [CrossRef]
  45. Srivastava, S.; Vidyarthi, A.; Jain, S. Analytical study of the encoder-decoder models for ultrasound image segmentation. Serv. Oriented Comput. Appl. 2023, 18, 81–100. [Google Scholar] [CrossRef]
  46. Yan, X.; Lin, B.; Fu, J.; Li, S.; Wang, H.; Fan, W.; Jiang, C. MRSNet: Joint consistent optic disc and cup segmentation based on large kernel residual convolutional attention and self-attention. Digit. Signal Process. 2023, 145, 104308. [Google Scholar] [CrossRef]
  47. Santosh Kumar, P.; Sakthivel, V.P.; Raju, M.; Satya, P.D. Brain tumor segmentation of the FLAIR MRI images using novel ResUnet. Biomed. Signal Process. Control 2023, 82, 104586. [Google Scholar] [CrossRef]
  48. Sharma, N.; Gupta, S.; Al Reshan, M.S.; Sulaiman, A.; Alshahrani, H.; Shaikh, A. EfficientNetB0 cum FPN Based Semantic Segmentation of Gastrointestinal Tract Organs in MRI Scans. Diagnostics 2023, 13, 2399. [Google Scholar] [CrossRef]
  49. Gomathi, P.; Muniraj, C.; Periasamy, P.S. Digital infrared thermal imaging system based breast cancer diagnosis using 4D U-Net segmentation. Biomed. Signal Process. Control 2023, 85, 104792. [Google Scholar] [CrossRef]
  50. Aleid, A.; Alhussaini, K.; Alanazi, R.; Altwaimi, M.; Altwijri, O.; Saad, A.S. Artificial Intelligence Approach for Early Detection of Brain Tumors Using MRI Images. Appl. Sci. 2023, 13, 3808. [Google Scholar] [CrossRef]
  51. Wang, C.; Zhang, Y.; Zhou, Y.; Sun, S.; Zhang, H.; Wang, Y. Automatic detection of indoor occupancy based on improved YOLOv5 model. Neural Comput. Appl. 2023, 35, 2575–2599. [Google Scholar] [CrossRef]
  52. Irujo, G.P. IRimage: Open source software for processing images from infrared thermal cameras. PeerJ Comput. Sci. 2022, 8, e977. [Google Scholar] [CrossRef]
  53. Nosrati, Z.; Bergamo, M.; Rodríguez-Rodríguez, C.; Saatchi, K.; Häfeli, U.O. Refinement and validation of infrared thermal imaging (IRT): A non-invasive technique to measure disease activity in a mouse model of rheumatoid arthritis. Arthritis Res. Ther. 2020, 22, 1–16. [Google Scholar] [CrossRef]
  54. Borges, P.A.C.; Silva, D.C.; da Silva, N.A.A.; Lima, V.H.; Queiroz, P.J.B.; Borges, N.C.; da Silva, L.A.F. Different methods of processing thermographic images to evaluate the carpal temperature of healthy calves. Cienc. Anim. Bras. 2022, 23, e70559. [Google Scholar] [CrossRef]
  55. Crameri, F.; Shephard, G.E.; Heron, P.J. The misuse of colour in science communication. Nat. Commun. 2020, 11, 5444. [Google Scholar] [CrossRef] [PubMed]
  56. Shaikh, S.; Akhter, N.; Manza, R. Medical Image Processing of Thermal Images in Light of Applied Color Palettes. Int. J. Eng. Adv. Technol. (IJEAT) 2019, 8, 1520–1524. [Google Scholar] [CrossRef]
  57. Gorczyca, M.T.; Milan, H.F.M.; Maia, A.S.C.; Gebremedhin, K.G. Machine learning algorithms to predict core, skin, and hair-coat temperatures of piglets. Comput. Electron. Agric. 2018, 151, 286–294. [Google Scholar] [CrossRef]
  58. Perdomo, C.C.; Kozen, E.A.; Sobestiansky, J.; Silva, A.P.; Silva, C.N.I. Considerações sobre edificações para suínos. 4.,1985. In Curso de Atualização sobre a Produção de Suínos; Embrapa, C., Aves, S.E., Eds.; CNPSA-EMBRAPA: Concórdia, Brazil, 1985. [Google Scholar]
  59. Kiefer, C.; Meignen, B.C.G.; Sanches, J.F.; Carrijo, A.S. Resposta de suínos em crescimento mantidos em diferentes temperaturas. Arch. Zootec. 2009, 58, 55–64. [Google Scholar] [CrossRef]
  60. Crone, C.; Caldara, F.R.; Martins, R.; de Oliveira, G.F.; Marcon, A.V.; Garcia, R.G.; dos Santos, L.S.; Almeida Paz, I.C.L.; Lippi, I.C.D.C.; Burbarelli, M.F.d.C. Environmental Enrichment for Pig welfare during Transport. J. Appl. Anim. Welf. Sci. 2023, 26, 393–403. [Google Scholar] [CrossRef]
  61. Dos Santos, T.C.; Carvalho, C.D.C.S.; da Silva, G.C.D.; Diniz, T.A.; Soares, T.E.; Moreira, S.D.J.M.; Cecon, P.R. Influência do ambiente térmico no comportamento e desempenho zootécnico de suínos. Rev. Ciências Agroveterinárias 2018, 17, 241–253. [Google Scholar] [CrossRef]
  62. Vásquez, N.; Cervantes, M.; Bernal-Barragán, H.; Rodríguez-Tovar, L.E.; Morales, A. Short- and Long-Term Exposure to Heat Stress Differently Affect Performance, Blood Parameters, and Integrity of Intestinal Epithelia of Growing Pigs. Animals 2022, 12, 2529. [Google Scholar] [CrossRef]
  63. Jia, G.; Li, W.; Meng, J.; Tan, H.; Feng, Y. Non-Contact Evaluation of Pigs’ Body Temperature Incorporating Environmental Factors. Sensors 2020, 20, 4282. [Google Scholar] [CrossRef]
  64. Alves, M.D.F.A.; Pandorfi, H.; Montenegro, A.A.D.A.; da Silva, R.A.B.; Gomes, N.F.; Santana, T.C.; de Almeida, G.L.P.; Marinho, G.T.B.; da Silva, M.V.; da Silva, W.A. Evaluation of Body Surface Temperature in Pigs Using Geostatistics. AgriEngineering 2023, 5, 1090–1103. [Google Scholar] [CrossRef]
  65. Sadeghi, M.; Banakar, A.; Minaei, S.; Orooji, M.; Shoushtari, A.; Li, G. Early Detection of Avian Diseases Based on Thermography and Artificial Intelligence. Animals 2023, 13, 2348. [Google Scholar] [CrossRef] [PubMed]
  66. Jaddoa, M.A.; Gonzalez, L.; Cuthbertson, H.; Al-Jumaily, A. Multiview eye localisation to measure cattle body temperature based on automated thermal image processing and computer vision. Infrared Phys. Technol. 2021, 119, 103932. [Google Scholar] [CrossRef]
  67. Saeedi, S.; Rezayi, S.; Keshavarz, H.R.; Niakan Kalhori, S. MRI-based brain tumor detection using convolutional deep learning methods and chosen machine learning techniques. BMC Med. Inform. Decis. Mak. 2023, 23, 16. [Google Scholar] [CrossRef] [PubMed]
  68. McIntyre, L.; Tuba, E. Brain Tumor Segmentation and Classification using Texture Features and Support Vector Machine. In Proceedings of the 11th International Symposium on Digital Forensics and Security (ISDFS), Chattanooga, TN, USA, 11–12 May 2023; pp. 1–5. [Google Scholar]
  69. Yang, X.; Zhao, Y.; Street, G.M.; Huang, Y.; Filip To, S.D.; Purswell, J.L. Classification of broiler behaviours using triaxial accelerometer and machine learning. Animal 2021, 15, 100269. [Google Scholar] [CrossRef] [PubMed]
  70. Andersen, H.M.L.; Jørgensen, E.; Dybkjær, L.; Jørgensen, B. The ear skin temperature as an indicator of the thermal comfort of pigs. Appl. Anim. Behav. Sci. 2008, 113, 43–56. [Google Scholar] [CrossRef]
  71. Hoffer, O.; Rabin, T.; Nir, R.R.; Brzezinski, R.Y.; Zimmer, Y.; Gannot, I. Automated thermal imaging monitors the local response to cervical cancer brachytherapy. J. Biophotonics 2023, 16, e202200214. [Google Scholar] [CrossRef]
  72. Conceição, A.R.; Coeli, A.C.; Braga, P.H.S.; Oliveira, P.D.C.S.; Schultz, E.B. Tecnologias aplicadas ao monitoramento de parâmetros fisiológicos na produção de ruminantes. Rev. Agrar. Acad. 2023, 6, 27–37. [Google Scholar] [CrossRef]
  73. Stewart, M.; Wilson, M.T.; Schaefer, A.L.; Huddart, F.; Sutherland, M.A. The use of infrared thermography and accelerometers for remote monitoring of dairy cow health and welfare. J. Dairy Sci. 2017, 100, 3893–3901. [Google Scholar] [CrossRef]
Figure 1. Flowchart of the proposed algorithm.
Figure 2. Demonstration of the point grid on the animal’s body.
Figure 3. Structural similarity index measure.
Figure 4. Histogram of the original image (a) and the enhanced image (b).
Figure 5. Removal of the H channel (a), bilateral filter (b), median filter (c) and histogram equalization (d).
Figure 6. Original image (a), original grayscale image (b), H channel (c), median filter (d), Otsu binarization (e), isolated pixels (f), inverted binarized image (g) and contour (h).
Figure 7. Original image with two animals (a), original grayscale image (b), H channel (c), median filter (d), Otsu binarization (e), isolated pixels (f), inverted binarized image (g) and contour (h).
Figure 8. Segmented colors (a), extracted pixels (b), mask (c), and outline (d).
Figure 9. Segmented image (a), gray image (b), mask (c) and outline (d).
Figure 10. Average surface temperatures for the air-conditioned environment obtained via the ThermaCam program and proposed algorithm (a). Linear regression (b).
Figure 11. Average surface temperatures for the environment without air conditioning (a). Linear regression (b).
Figure 12. Region of highest surface temperature detected by the proposed algorithm.
Table 1. Lower and upper values of the HSV space.
Color | Lower Limit | Upper Limit
Yellow | 10, 100, 100 | 50, 255, 255
Blue | 100, 100, 100 | 140, 255, 255
Green | 40, 100, 100 | 80, 255, 255
Red | 160, 100, 100 | 200, 255, 255
Source: [31].
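For illustration, the bounds in Table 1 could be applied as follows to build one binary mask per color class; the dictionary layout is an assumption, and note that OpenCV’s default hue range is 0–179, so these values would need rescaling if they follow a 0–255 hue convention.

```python
import cv2
import numpy as np

# HSV bounds from Table 1 (hue scale follows the table's source; rescale for OpenCV if needed)
HSV_BOUNDS = {
    "yellow": ((10, 100, 100), (50, 255, 255)),
    "blue":   ((100, 100, 100), (140, 255, 255)),
    "green":  ((40, 100, 100), (80, 255, 255)),
    "red":    ((160, 100, 100), (200, 255, 255)),
}

def color_masks(hsv_image: np.ndarray) -> dict:
    """Return one binary mask per color class defined in Table 1."""
    return {name: cv2.inRange(hsv_image, np.array(low), np.array(high))
            for name, (low, high) in HSV_BOUNDS.items()}

# The individual masks can then be combined with cv2.bitwise_or to obtain the
# full segmented region of interest.
```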
Table 2. Color intensity and associated temperature.
Color Intensity | Matching Temperature (°C)
(15, 0, 15) | 23.1
(31, 0, 31) | 23.2
(47, 0, 47) | 23.4
(63, 0, 63) | 23.5
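A possible way to use this calibration is to interpolate linearly between the listed intensity–temperature pairs, as in the sketch below; using the first channel of the color triplet as the index is an assumption for illustration.

```python
import numpy as np

# Calibration pairs from Table 2 (first channel of the color triplet vs. temperature)
intensities = np.array([15.0, 31.0, 47.0, 63.0])
temperatures = np.array([23.1, 23.2, 23.4, 23.5])

def intensity_to_temp(value: float) -> float:
    """Linearly interpolate the temperature associated with a color intensity."""
    return float(np.interp(value, intensities, temperatures))

print(intensity_to_temp(40))   # ≈23.31 °C, between the entries for 31 and 47
```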
Table 3. Measurement metrics for the performance of segmentations.
Method | Dice | Jaccard | Precision
Otsu | 0.89 | 0.81 | 0.87
Color | 0.90 | 0.83 | 0.88
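For reference, the Dice and Jaccard coefficients of Table 3 can be computed from a predicted segmentation mask and a manually delineated reference mask as in the sketch below (the synthetic masks are illustrative).

```python
import numpy as np

def dice_jaccard(pred: np.ndarray, ref: np.ndarray):
    """Dice and Jaccard coefficients between two binary segmentation masks."""
    pred, ref = pred.astype(bool), ref.astype(bool)
    inter = np.logical_and(pred, ref).sum()
    dice = 2 * inter / (pred.sum() + ref.sum())
    jaccard = inter / np.logical_or(pred, ref).sum()
    return dice, jaccard

# Synthetic overlapping square masks
pred = np.zeros((100, 100), bool); pred[20:80, 20:80] = True
ref = np.zeros((100, 100), bool); ref[25:85, 25:85] = True
print(dice_jaccard(pred, ref))   # values approaching 1.0 indicate better overlap
```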
Table 4. Performance metrics of the proposed algorithm for the extraction of temperatures from thermal images.
Metric | Air-Conditioned Environment | Non-Air-Conditioned Environment
MAE | 0.280 | 0.25
RMSE | 0.07 | 0.09
R² | 0.96 | 0.93
MAE—Mean Absolute Error; RMSE—Root Mean Squared Error; and R²—Coefficient of determination.
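The metrics in Table 4 can be reproduced from paired temperature readings as sketched below; the example arrays stand in for reference-program readings and algorithm outputs and are purely illustrative.

```python
import numpy as np

def mae_rmse_r2(reference: np.ndarray, predicted: np.ndarray):
    """Mean absolute error, root mean squared error and coefficient of determination."""
    err = predicted - reference
    mae = np.abs(err).mean()
    rmse = np.sqrt((err ** 2).mean())
    ss_res = (err ** 2).sum()
    ss_tot = ((reference - reference.mean()) ** 2).sum()
    return mae, rmse, 1.0 - ss_res / ss_tot

ref = np.array([33.1, 34.0, 35.2, 36.5])    # e.g., reference program readings (°C)
pred = np.array([33.0, 34.2, 35.0, 36.4])   # algorithm output (°C)
print(mae_rmse_r2(ref, pred))
```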
Table 5. Descriptive statistics of the surface temperature of pigs.
Statistics | Air-Conditioned Environment | Natural Environment
Mean | 30.40–35.70 °C | 33.02–38.38 °C
Minimum | 22.80 °C | 34.10 °C
Maximum | 38.40 °C | 42.80 °C
Table 6. SVM classifier performance.
Precision | Accuracy | AUC
0.80 | 0.91 | 1.00
AUC: Area under the curve.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
