Feather Damage Monitoring System Using RGB-Depth-Thermal Model for Chickens
Simple Summary
Abstract
1. Introduction
2. Materials and Methods
2.1. Experiment Materials and Image Acquisition
2.2. Methodology
- Image acquisition (binocular RGB images and thermal infrared images);
- Camera calibration of binocular RGB cameras and a thermal infrared camera;
- Stereo matching for disparity prediction based on the binocular RGB images (a disparity-to-depth sketch follows this list);
- RGB-D semantic segmentation based on the left RGB image and the predicted depth image;
- Image registration of the left RGB image and the thermal infrared image;
- Point cloud fusion of the color point clouds and the thermal infrared point clouds. The color point clouds were reconstructed from the left RGB image and the depth image, and the thermal infrared point clouds were reconstructed from the thermal infrared image and the depth image.
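For the disparity-prediction step, the paper uses the deep learning-based AANet network (Section 2.2.2) and compares it with classical algorithms such as SGBM. The sketch below is a minimal illustration of how a disparity map is converted into the depth image used downstream, using OpenCV's SGBM matcher only as a classical stand-in; the function name `disparity_to_depth` and all parameter values are illustrative assumptions, not the authors' implementation.

```python
import cv2
import numpy as np

def disparity_to_depth(left_gray, right_gray, focal_px, baseline_m, num_disp=128):
    """Estimate a dense depth map (metres) from a rectified stereo pair.

    SGBM is used here only as a classical stand-in for the AANet
    predictor of Section 2.2.2; all parameter values are illustrative.
    """
    matcher = cv2.StereoSGBM_create(
        minDisparity=0,
        numDisparities=num_disp,   # must be a multiple of 16
        blockSize=5,
        P1=8 * 5 * 5,              # smoothness penalties (small / large disparity jumps)
        P2=32 * 5 * 5,
        uniquenessRatio=10,
        speckleWindowSize=100,
        speckleRange=2,
    )
    # OpenCV returns fixed-point disparities scaled by 16
    disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
    disparity[disparity <= 0] = np.nan            # mark invalid matches
    depth_m = focal_px * baseline_m / disparity   # pinhole stereo: depth = f * B / d
    return disparity, depth_m
```

The focal length (in pixels) and baseline come from the stereo calibration in Section 2.2.1; in the paper itself, the disparity map is predicted by AANet rather than by SGBM.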
2.2.1. Camera Calibration
2.2.2. Disparity Prediction
2.2.3. RGB-D Semantic Segmentation
2.2.4. Thermal-RGB Image Registration
2.2.5. Three-Dimensional Reconstruction
- (i) The RGB image, the depth image, and the registered thermal infrared image were each masked with the mask image using bitwise AND operations [63];
- (ii) The masked RGB image was combined with the depth image to obtain the target color point clouds, and the masked, registered thermal infrared image was combined with the depth image to obtain the target temperature point clouds;
- (iii) The color point clouds and the temperature point clouds were fused by dimensional expansion to obtain the final RGB-D-T model of the chicken (see the fusion sketch after this list).
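The three steps above can be condensed into a short sketch. The snippet below is a minimal illustration under simple pinhole assumptions: the function `build_rgbdt_point_cloud`, its intrinsic parameters, and the masking details are assumptions for illustration rather than the authors' code. It applies the bitwise AND masking, back-projects valid depth pixels, and stacks the colour and temperature channels onto the 3D coordinates (the "dimensional expansion").

```python
import cv2
import numpy as np

def build_rgbdt_point_cloud(rgb, depth, thermal, mask, fx, fy, cx, cy):
    """Fuse the masked RGB, depth, and registered thermal images into a
    single N x 7 array [X, Y, Z, R, G, B, T] ("dimensional expansion").

    Assumptions (not from the paper's code): a pinhole model with
    intrinsics (fx, fy, cx, cy) of the left RGB camera, depth in metres,
    thermal already registered to the RGB view (Section 2.2.4), and mask
    as an 8-bit single-channel image where chicken pixels are non-zero.
    """
    # (i) keep only chicken pixels with bitwise AND against the mask
    rgb_masked = cv2.bitwise_and(rgb, rgb, mask=mask)
    depth_masked = np.where(mask > 0, depth, 0.0)
    thermal_masked = np.where(mask > 0, thermal, 0.0)

    # (ii) back-project every valid depth pixel through the pinhole model
    v, u = np.nonzero(depth_masked > 0)
    z = depth_masked[v, u]
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy

    # (iii) dimensional expansion: stack geometry, colour, and temperature
    colors = rgb_masked[v, u].astype(np.float32)      # (N, 3) R, G, B
    temps = thermal_masked[v, u].astype(np.float32)   # (N,) temperature per point
    return np.column_stack([x, y, z, colors, temps])  # (N, 7)
```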
2.3. Feather Damage Detection and Depth Estimation
2.4. Performance Metrics
3. Results and Discussion
3.1. 3D Reconstruction Results of Chicken
3.2. Evaluation of Feather Damage Depth Estimation
3.3. Feather Damage Monitoring Based on the Proposed Method vs. 2D Thermal Infrared Image or RGB Image
3.4. Evaluation of Chicken Disparity Prediction
3.5. Evaluation of Chicken Semantic Segmentation Based on Color and Depth Images
3.6. Time Efficiency Analysis
3.7. Limitations and Future Works
4. Conclusions
- (1) A feather damage monitoring system was proposed using binocular RGB cameras and a thermal infrared camera. The depth image of the chicken was predicted with the AANet network from the binocular RGB images. The chicken was segmented from the background with the RedNet network using the RGB and depth images. The RGB image and the thermal infrared image were registered by the proposed heterogeneous image registration method. Four kinds of images, namely RGB, depth, thermal, and mask, were used as inputs to reconstruct the 3D model of the chicken with RGB-Depth-Thermal maps. The results showed that the deep learning-based AANet network was more efficient than the three traditional stereo matching algorithms.
- (2) Based on the obtained RGB-D-T model, an automated assessment algorithm for the depth of feather damage was developed. The feather damage region was extracted by image pre-processing of the thermal infrared image, and the 2D region was projected to a unique 3D patch on the 3D point clouds. The depth value was calculated as the difference between the maximum and minimum z-values of the target point clouds after filtering (a sketch of this estimate follows the list). The results showed that the R² was 0.946, with an RMSE of 2.015 mm between the predicted depth of feather damage and the manual measurement.
- (3) The feather damage monitoring system for chickens was tested during both day and night. The results indicated that the proposed RGB-D-T model was more effective for feather damage detection than a 2D RGB image or thermal infrared image alone. These results provide ideas for future research on automated and intelligent feather damage monitoring in poultry farming.
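As a hedged illustration of the depth-of-damage estimate summarized in point (2), the sketch below works on the N x 7 [X, Y, Z, R, G, B, T] array from the earlier fusion sketch: warm points are selected as the damage patch, z outliers are filtered, and the depth is the span between the maximum and minimum z values. The fixed temperature threshold and the simple sigma filter are illustrative stand-ins for the paper's thermal pre-processing and filtering steps.

```python
import numpy as np

def estimate_damage_depth_mm(points_xyzrgbt, temp_thresh_c, sigma=2.0):
    """Estimate feather-damage depth in millimetres from the N x 7
    RGB-D-T point cloud ([X, Y, Z, R, G, B, T], coordinates in metres).

    Illustrative assumptions: a fixed temperature threshold selects the
    exposed-skin (damage) patch, and a sigma rule filters z outliers
    before taking the max-minus-min z span.
    """
    temps = points_xyzrgbt[:, 6]
    patch = points_xyzrgbt[temps > temp_thresh_c]
    if patch.shape[0] == 0:
        return None                                   # no damage region found

    z = patch[:, 2]
    z = z[np.abs(z - z.mean()) < sigma * z.std()]     # drop z outliers
    if z.size == 0:
        return None
    return float(z.max() - z.min()) * 1000.0          # metres -> millimetres
```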
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
FS | Feather scoring |
IRT | Infrared thermography |
TOF | Time-of-flight |
LiDAR | Structured light cameras or light detection and ranging |
RGB-D-T | RGB-Depth-Thermal |
ICP | Iterative Closest Point |
FCN | Fully convolutional networks |
AANet | Adaptive aggregation network |
AA Modules | Adaptive Aggregation Modules |
ISA | Intra-Scale Aggregation |
CSA | Cross-Scale Aggregation |
RedNet | Residual encoder–decoder network |
ResNet-50 | Residual network with 50 layers |
SGD | Stochastic gradient descent |
MSAC | M-Estimate Sample Consensus |
XOR | Exclusive-or |
R² | Coefficient of determination |
RMSE | Root mean square error |
PA | Pixel accuracy |
IoU | Intersection-over-union |
MoH | Moderate heat |
MoC | Moderate cold |
SGBM | Semi-Global Block Matching |
AD-Census | Absolute differences measure and census transform |
PMS | PatchMatch Stereo |
References
- Buller, H.; Blokhuis, H.; Jensen, P.; Keeling, L. Towards farm animal welfare and sustainability. Animals 2018, 8, 81. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Xu, D.; Shu, G.; Liu, Y.; Qin, P.; Zheng, Y.; Tian, Y.; Zhao, X.; Du, X. Farm environmental enrichments improve the welfare of layer chicks and pullets: A comprehensive review. Animals 2022, 12, 2610. [Google Scholar] [CrossRef] [PubMed]
- Tauson, R.; Kjaer, J.; Maria, G.; Cepero, R.; Holm, K. Applied scoring of integument and health in laying hens. Anim. Sci. Pap. Rep. 2005, 23, 153–159. [Google Scholar]
- Savory, C.J. Feather pecking and cannibalism. World’s Poult. Sci. J. 1995, 51, 215–219. [Google Scholar] [CrossRef]
- Glatz, P.C. Effect of poor feather cover on feed intake and production of aged laying hens. Asian-Australas. J. Anim. Sci. 2001, 14, 553–558. [Google Scholar] [CrossRef]
- Bilcik, B.; Keeling, L.J. Changes in feather condition in relation to feather pecking and aggressive behaviour in laying hens. Br. Poult. Sci. 1999, 40, 444–451. [Google Scholar] [CrossRef]
- Spindler, B.; Weseloh, T.; Esser, C.; Freytag, S.K.; Klambeck, L.; Kemper, N.; Andersson, R. The effects of UV-A light provided in addition to standard lighting on plumage condition in laying hens. Animals 2020, 10, 1106. [Google Scholar] [CrossRef]
- Kjaer, J.B.; Sorensen, P. Feather pecking and cannibalism in free-range laying hens as affected by genotype, dietary level of methionine + cystine, light intensity during rearing and age at first access to the range area. Appl. Anim. Behav. Sci. 2002, 76, 21–39. [Google Scholar] [CrossRef]
- Savory, C.J.; Mann, J.S.; Macleod, M.G. Incidence of pecking damage in growing bantams in relation to food form, group size, stocking density, dietary tryptophan concentration and dietary protein source. Br. Poult. Sci. 1999, 40, 579–584. [Google Scholar] [CrossRef]
- Savory, C.J.; Mann, J.S. Feather pecking in groups of growing bantams in relation to floor litter substrate and plumage colour. Br. Poult. Sci. 1999, 40, 565–572. [Google Scholar] [CrossRef]
- El-Lethey, H.; Aerni, V.; Jungi, T.W.; Wechsler, B. Stress and feather pecking in laying hens in relation to housing conditions. Br. Poult. Sci. 2000, 41, 22–28. [Google Scholar] [CrossRef] [PubMed]
- Van de Weerd, H.A.; Elson, A. Rearing factors that influence the propensity for injurious feather pecking in laying hens. World’s Poult. Sci. J. 2006, 62, 654–664. [Google Scholar]
- Newberry, R.C.; Keeling, L.J.; Estevez, I.; Bilcik, B. Behaviour when young as a predictor of severe feather pecking in adult laying hens: The redirected foraging hypothesis revisited. Appl. Anim. Behav. Sci. 2007, 107, 262–274. [Google Scholar] [CrossRef]
- Nichelmann, M.; Baranyiova, E.; Goll, R.; Tzschentke, B. Influence of feather cover on heat balance in laying hens (Gallus domesticus). J. Therm. Biol. 1986, 11, 121–126. [Google Scholar] [CrossRef]
- Decina, C.; Berke, O.; van Staaveren, N.; Baes, C.F.; Harlander-Matauscheck, A. Development of a scoring system to assess feather damage in canadian laying hen flocks. Animals 2019, 9, 436. [Google Scholar] [CrossRef] [Green Version]
- Van Staaveren, N.; Ellis, J.; Baes, C.F.; Harlander-Matauschek, A. A meta-analysis on the effect of environmental enrichment on feather pecking and feather damage in laying hens. Poult. Sci. 2021, 100, 397–411. [Google Scholar] [CrossRef]
- Cronin, G.M.; Glatz, P.C. Causes of feather pecking and subsequent welfare issues for the laying hen: A review. Anim. Prod. Sci. 2021, 61, 990–1005. [Google Scholar] [CrossRef]
- Tauson, R.; Ambrosen, T.; Elwinger, K. Evaluation of procedures for scoring the integument of laying hens—Independent scoring of plumage condition. Acta Agric. Scand. 1984, 34, 400–408. [Google Scholar] [CrossRef]
- Cook, N.J.; Smykot, A.B.; Holm, D.E.; Fasenko, G.; Church, J.S. Assessing feather cover of laying hens by infrared thermography. J. Appl. Poult. Res. 2006, 15, 274–279. [Google Scholar] [CrossRef]
- Zhao, Y.; Xin, H.; Dong, B. Use of infrared thermography to assess laying-hen feather coverage. Poult. Sci. 2013, 92, 295–302. [Google Scholar] [CrossRef]
- Mota-Rojas, D.; Titto, C.G.; de Mira Geraldo, A.; Martinez-Burnes, J.; Gomez, J.; Hernandez-avalos, I.; Casas, A.; Dominguez, A.; Jose, N.; Bertoni, A.; et al. Efficacy and function of feathers, hair, and glabrous skin in the thermoregulation strategies of domestic animals. Animals 2021, 11, 3472. [Google Scholar] [CrossRef] [PubMed]
- Cilulko, J.; Janiszewski, P.; Bogdaszewski, M.; Szczygielska, E. Infrared thermal imaging in studies of wild animals. Eur. J. Wildl. Res. 2013, 59, 17–23. [Google Scholar] [CrossRef]
- Redaelli, V.; Ludwig, N.; Costa, L.N.; Crosta, L.; Riva, J.; Luzi, F. Potential application of thermography (IRT) in animal production and for animal welfare. A case report of working dogs. Ann. Ist. Super. Sanita 2014, 50, 147–152. [Google Scholar] [PubMed]
- Casas-Alvarado, A.; Mota-Rojas, D.; Hernandez-Avalos, I.; Mora-Medina, P.; Olmos-Hernandez, A.; Verduzco-Mendoza, A.; Reyes-Sotelo, B.; Martinez-Burnes, J. Advances in infrared thermography: Surgical aspects, vascular changes, and pain monitoring in veterinary medicine. J. Therm. Biol. 2020, 92, 102664. [Google Scholar] [CrossRef]
- McCafferty, D.J. Applications of thermal imaging in avian science. Ibis 2013, 155, 4–15. [Google Scholar] [CrossRef] [Green Version]
- Barreto, C.D.; Alves, F.V.; Ramos, C.; Leite, M.C.D.; Leite, L.C.; Karvatte, N. Infrared thermography for evaluation of the environmental thermal comfort for livestock. Int. J. Biometeorol. 2020, 64, 881–888. [Google Scholar] [CrossRef] [PubMed]
- Giersberg, M.F.; Spindler, B.; Kemper, N. Assessment of plumage and integument condition in dual-purpose breeds and conventional layers. Animals 2017, 7, 97. [Google Scholar] [CrossRef] [Green Version]
- Nääs, I.d.A.; Bites Romanini, C.E.; Neves, D.P.; do Nascimento, G.R.; Vercellino, R.d.A. Broiler surface temperature distribution of 42 day old chickens. Sci. Agric. 2010, 67, 497–502. [Google Scholar] [CrossRef] [Green Version]
- Pichová, K.; Bilcik, B.; Kost’al, L. Assessment of the effect of housing on feather damage in laying hens using IR thermography. Animal 2017, 11, 661–669. [Google Scholar] [CrossRef] [Green Version]
- Del Valle, J.E.; Pereira, D.F.; Mollo Neto, M.; Almeida Gabriel Filho, L.R.; Salgado, D.D. Unrest index for estimating thermal comfort of poultry birds (Gallus gallus domesticus) using computer vision techniques. Biosyst. Eng. 2021, 206, 123–134. [Google Scholar] [CrossRef]
- Okinda, C.; Lu, M.; Liu, L.; Nyalala, I.; Muneri, C.; Wang, J.; Zhang, H.; Shen, M. A machine vision system for early detection and prediction of sick birds: A broiler chicken model. Biosyst. Eng. 2019, 188, 229–242. [Google Scholar] [CrossRef]
- Pereira, D.F.; Miyamoto, B.C.B.; Maia, G.D.N.; Sales, G.T.; Magalhaes, M.M.; Gates, R.S. Machine vision to identify broiler breeder behavior. Comput. Electron. Agric. 2013, 99, 194–199. [Google Scholar] [CrossRef]
- Xiao, L.; Ding, K.; Gao, Y.; Rao, X. Behavior-induced health condition monitoring of caged chickens using binocular vision. Comput. Electron. Agric. 2019, 156, 254–262. [Google Scholar] [CrossRef]
- Paturkar, A.; Sen Gupta, G.; Bailey, D. Making use of 3D models for plant physiognomic analysis: A review. Remote Sens. 2021, 13, 2232. [Google Scholar] [CrossRef]
- Susperregi, L.; Maria Martinez-Otzeta, J.; Ansuategui, A.; Ibarguren, A.; Sierra, B. Rgb-d, laser and thermal sensor fusion for people following in a mobile robot. Int. J. Adv. Robot. Syst. 2013, 10, 271. [Google Scholar] [CrossRef]
- Krishnan, A.K.; Saripalli, S. Cross-calibration of rgb and thermal cameras with a lidar for rgb-depth-thermal mapping. Unmanned Syst. 2017, 5, 59–78. [Google Scholar] [CrossRef]
- Chang, M.-C.; Yu, T.; Luo, J.; Duan, K.; Tu, P.; Zhao, Y.; Nagraj, N.; Rajiv, V.; Priebe, M.; Wood, E.A.; et al. Multimodal sensor system for pressure ulcer wound assessment and care. IEEE Trans. Ind. Inform. 2018, 14, 1186–1196. [Google Scholar] [CrossRef]
- Shi, Y.; Payeur, P.; Frize, M.; Bariciak, E. Thermal and rgb-d imaging for necrotizing enterocolitis detection. In Proceedings of the 2020 IEEE International Symposium on Medical Measurements and Applications, Virtual, 1 June–1 July 2020; pp. 1–6. [Google Scholar]
- Lagüela, S.; Armesto, J.; Arias, P.; Herraez, J. Automation of thermographic 3D modelling through image fusion and image matching techniques. Autom. Constr. 2012, 27, 24–31. [Google Scholar] [CrossRef]
- Ham, Y.; Golparvar-Fard, M. An automated vision-based method for rapid 3D energy performance modeling of existing buildings using thermal and digital imagery. Adv. Eng. Inform. 2013, 27, 395–409. [Google Scholar] [CrossRef]
- Seitz, S.M.; Curless, B.; Diebel, J.; Scharstein, D.; Szeliski, R. A comparison and evaluation of multi-view stereo reconstruction algorithms. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, New York, NY, USA, 17–22 June 2006; pp. 519–528. [Google Scholar]
- Hoegner, L.; Tuttas, S.; Xu, Y.; Eder, K.; Stilla, U. Evaluation of methods for coregistration and fusion of RPAS-based 3D point clouds and thermal infrared images. ISPRS—Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2016, XLI-B3, 241–246. [Google Scholar] [CrossRef] [Green Version]
- Besl, P.J.; Mckay, N.D. A method for registration of 3-D shapes. IEEE Trans. Pattern Anal. Mach. Intell. 1992, 14, 239–256. [Google Scholar] [CrossRef] [Green Version]
- Di Stefano, L.; Marchionni, M.; Mattoccia, S. A fast area-based stereo matching algorithm. Image Vis. Comput. 2004, 22, 983–1005. [Google Scholar] [CrossRef]
- Wu, F.; Fang, X. An improved RANSAC homography algorithm for feature based image mosaic. In Proceedings of the 7th WSEAS International Conference on Signal Processing, Computational Geometry & Artificial Vision, Athens, Greece, 24–26 August 2007; pp. 202–207. [Google Scholar]
- Lowe, D.G. Distinctive image features from scale-invariant keypoints. Int. J. Comput. Vis. 2004, 60, 91–110. [Google Scholar] [CrossRef]
- Long, J.; Shelhamer, E.; Darrell, T. Fully convolutional networks for semantic segmentation. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Boston, MA, USA, 7–12 June 2015; pp. 3431–3440. [Google Scholar]
- Ronneberger, O.; Fischer, P.; Brox, T. U-Net: Convolutional networks for biomedical image segmentation. In Proceedings of the International Conference on Medical Image Computing and Computer-Assisted Intervention, Munich, Germany, 5–9 October 2015; pp. 234–241. [Google Scholar]
- Badrinarayanan, V.; Kendall, A.; Cipolla, R. Segnet: A deep convolutional encoder-decoder architecture for image segmentation. IEEE Trans. Pattern Anal. Mach. Intell. 2017, 39, 2481–2495. [Google Scholar] [CrossRef] [PubMed]
- Chen, L.-C.; Papandreou, G.; Kokkinos, I.; Murphy, K.; Yuille, A.L. Deeplab: Semantic image segmentation with deep convolutional nets, atrous convolution, and fully connected CRFs. IEEE Trans. Pattern Anal. Mach. Intell. 2017, 40, 834–848. [Google Scholar] [CrossRef] [Green Version]
- Park, S.J.; Hong, K.S.; Lee, S. Rdfnet: Rgb-d multi-level residual feature fusion for indoor semantic segmentation. In Proceedings of the IEEE International Conference on Computer Vision, Venice, Italy, 22–29 October 2017; pp. 4980–4989. [Google Scholar]
- Zhang, Z.Y. A flexible new technique for camera calibration. IEEE Trans. Pattern Anal. Mach. Intell. 2000, 22, 1330–1334. [Google Scholar] [CrossRef] [Green Version]
- Li, H.S.; Cao, Z.J. Matlab codes of subset simulation for reliability analysis and structural optimization. Struct. Multidisc. Optim. 2016, 54, 391–410. [Google Scholar] [CrossRef]
- Xu, H.; Zhang, J. AANet: Adaptive aggregation network for efficient stereo matching. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Seattle, WA, USA, 13–19 June 2020; pp. 1959–1968. [Google Scholar]
- Gu, X.; Fan, Z.; Zhu, S.; Dai, Z.; Tan, F.; Tan, P. Cascade cost volume for high-resolution multi-view stereo and stereo matching. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Seattle, WA, USA, 13–19 June 2020; pp. 2495–2504. [Google Scholar]
- Ilg, E.; Mayer, N.; Saikia, T.; Keuper, M.; Dosovitskiy, A.; Brox, T. Flownet 2.0: Evolution of optical flow estimation with deep networks. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 2462–2470. [Google Scholar]
- Chabra, R.; Straub, J.; Sweeney, C.; Newcombe, R.; Fuchs, H. Stereodrnet: Dilated residual stereonet. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Long Beach, CA, USA, 16–19 June 2019; pp. 11786–11795. [Google Scholar]
- Kingma, D.P.; Ba, J. Adam: A method for stochastic optimization. arXiv 2014, arXiv:1412.6980. [Google Scholar]
- Jiang, J.; Zheng, L.; Luo, F.; Zhang, Z. Rednet: Residual encoder-decoder network for indoor rgb-d semantic segmentation. arXiv 2018, arXiv:1806.01054. [Google Scholar]
- Smith, P.R. Bilinear interpolation of digital images. Ultramicroscopy 1981, 6, 201–204. [Google Scholar] [CrossRef]
- Li, H.; Ding, W.; Cao, X.; Liu, C. Image registration and fusion of visible and infrared integrated camera for medium-altitude unmanned aerial vehicle remote sensing. Remote Sens. 2017, 9, 441. [Google Scholar] [CrossRef] [Green Version]
- Torr, P.H.S.; Murray, D.W. The development and comparison of robust methods for estimating the fundamental matrix. Int. J. Comput. Vis. 1997, 24, 271–300. [Google Scholar] [CrossRef]
- Seshadri, V.; Hsieh, K.; Boroum, A.; Lee, D.; Kozuch, M.A.; Mutlu, O.; Gibbons, P.B.; Mowry, T.C. Fast bulk bitwise AND and OR in DRAM. IEEE Comput. Archit. Lett. 2015, 14, 127–131. [Google Scholar] [CrossRef]
- Otsu, N. A threshold selection method from gray-level histograms. IEEE Trans. Syst. Man. Cybern. 1979, 9, 62–66. [Google Scholar] [CrossRef]
- Patel, S.; Ramzan, Z.; Sundaram, G.S. Luby-Rackoff ciphers: Why XOR is not so exclusive. In Proceedings of the International Workshop on Selected Areas in Cryptography, St. John’s, NL, Canada, 15–16 August 2002; pp. 271–290. [Google Scholar]
- Dziubich, T.; Szymanski, J.; Brzeski, A.; Cychnerski, J.; Korlub, W. Depth images filtering in distributed streaming. Polish Marit. Res. 2016, 23, 91–98. [Google Scholar] [CrossRef] [Green Version]
- Bloch, V.; Barchilon, N.; Halachmi, I.; Druyan, S. Automatic broiler temperature measuring by thermal camera. Biosyst. Eng. 2020, 199, 127–134. [Google Scholar] [CrossRef]
- Hirschmueller, H. Stereo processing by semiglobal matching and mutual information. IEEE Trans. Pattern Anal. Mach. Intell. 2008, 30, 328–341. [Google Scholar] [CrossRef]
- Mei, X.; Sun, X.; Zhou, M.; Jiao, S.; Wang, H.; Zhang, X. On building an accurate stereo matching system on graphics hardware. In Proceedings of the IEEE International Conference on Computer Vision Workshops, Barcelona, Spain, 6–13 November 2011; pp. 467–474. [Google Scholar]
- Bleyer, M.; Rhemann, C.; Rother, C. PatchMatch stereo-stereo matching with slanted support windows. In Proceedings of the British Machine Vision Conference, Dundee, Scotland, 30 August–1 September 2011; pp. 1–11. [Google Scholar]
- Russell, B.C.; Torralba, A.; Murphy, K.P.; Freeman, W.T. LabelMe: A database and web-based tool for image annotation. Int. J. Comput. Vis. 2008, 77, 157–173. [Google Scholar] [CrossRef]
- Guo, Y.; Aggrey, S.E.; Oladeinde, A.; Johnson, J.; Zock, G.; Chai, L. A machine vision-based method optimized for restoring broiler chicken images occluded by feeding and drinking equipment. Animals 2021, 11, 123. [Google Scholar] [CrossRef]
- Veeramani, B.; Raymond, J.W.; Chanda, P. DeepSort: Deep convolutional networks for sorting haploid maize seeds. BMC Bioinformatics 2018, 19, 85–93. [Google Scholar] [CrossRef]
Environment | Head Area (°C) | Back Area (°C) | Tail Area (°C) |
---|---|---|---|
MoH * | 43.6 ± 0.9 | 39.2 ± 1.2 | 38.8 ± 1.0 |
MoC * | 39.5 ± 1.9 | 35.4 ± 1.5 | 34.6 ± 1.3 |
Methods | PA | IoU | Model Size (MB) | Inference Speed (fps) |
---|---|---|---|---|
FCN | 0.840 | 0.839 | 77 | 21.277 |
U-Net | 0.921 | 0.919 | 30 | 21.739 |
RedNet | 0.997 | 0.978 | 313 | 17.857 |
Subtasks | Average Time (s) | Standard Deviation (s) |
---|---|---|
Disparity prediction | 0.491 | 0.122 |
Semantic segmentation | 0.056 | 0.008 |
Image registration | 0.020 | 0.005 |
3D reconstruction | 0.060 | 0.014 |
Total | 0.627 | 0.149 |
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).