Article

Research on an Identification and Grasping Device for Dead Yellow-Feather Broilers in Flat Houses Based on Deep Learning

1 College of Engineering, Nanjing Agricultural University, Nanjing 210031, China
2 College of Artificial Intelligence, Nanjing Agricultural University, Nanjing 210031, China
3 School of Electrical and Control Engineering, Xuzhou University of Technology, Xuzhou 221018, China
4 Faculty of Engineering and Information Technology, University of Technology Sydney, Sydney, NSW 2007, Australia
* Author to whom correspondence should be addressed.
Agriculture 2024, 14(9), 1614; https://doi.org/10.3390/agriculture14091614
Submission received: 24 June 2024 / Revised: 9 September 2024 / Accepted: 10 September 2024 / Published: 14 September 2024

Abstract: The existence of dead broilers in flat broiler houses poses significant challenges to large-scale and welfare-oriented broiler breeding. To ensure the timely identification and removal of dead broilers, a mobile, vision-based device for grasping them was designed in this study. Among the recognition models explored, the YOLOv6 model was selected for its superior performance, attaining 86.1% identification accuracy. Integrated with a specially designed robotic arm, this model forms an effective system for grasping dead broilers. Extensive experiments were conducted to validate the efficacy of the device. The results show that the device achieved an average grasping rate of dead broilers of 81.3%. These findings indicate that the proposed device holds great potential for practical field deployment, offering a reliable solution for the prompt identification and grasping of dead broilers and thereby enhancing the overall management and welfare of broiler populations.

1. Introduction

As a domesticated poultry type with a long history, broilers (broiler chickens) are highly popular globally. Broiler meat is economical and nutritious, offering advantages over pork and beef such as higher protein, lower fat, and fewer calories [1]. In recent years, the development of large-scale flat farming of broilers has been driven by the increasing demand for low-fat, high-protein broiler meat [2]. The flat breeding mode refers to raising a flock of broilers directly on the ground or on a mesh floor [3]. Broilers raised in this mode have full mobility, high bone strength, and good meat quality [4].
Identifying and clearing dead broilers manually is a time-consuming and laborious task. First, the large amount of auditory input and cognitive information processing competes with the visual search, reducing attention during a single repetitive task [5]. Second, the breeding environment contains high levels of gases harmful to humans, such as NH3, H2S, and CO [6], as well as a large amount of dust [7]; both factors are detrimental to the health of people working in these environments. Finally, because inspections must be performed across different broiler coops [8], manually identifying and clearing dead broilers increases the risks of cross-infection between coops and of disease transmission between poultry and mammals. Intelligent dead broiler removal devices can therefore be used to mitigate these drawbacks.
The early identification of dead broilers mainly relied on traditional methods based on features such as broiler vocalization, body temperature, and posture [9,10,11,12]. With the continuous advancement of technology, deep learning and vision technologies have been applied to identify the conditions of livestock and poultry [13]. Vision technology is highly efficient and can assess the conditions of broilers without touching them [14], while deep learning learns from massive amounts of data [15], further improving the accuracy and efficiency of recognition.
A number of studies have proposed methods to recognize dead broilers based on different characteristics of chickens. Lu et al. monitored the color of chicken combs using machine vision to identify dead broilers [16]; however, this required three consecutive even shakes of the coop, which could induce stress behaviors in yellow-feathered broilers. Zheng et al. used machine vision to judge the feeding posture of laying hens in order to classify sick and dead hens [17]; however, this method is prone to misidentification when a healthy hen's posture is abnormal or the camera's capture position shifts. Qu et al. proposed a LibSVM-based algorithm for the visual detection of dead broilers that judges the morphological features of the chicken claws [18], but this method relies too heavily on the claws and is prone to misjudgment. Veera et al. identified dead broilers using infrared thermal images and broiler contours [19], but the algorithm was not suitable for older chickens (after nine weeks) or for high densities of dead chickens. All of these studies have proven valuable for subsequent applications, but their dead chicken detection algorithms generally depend on the environment and on a single feature of the chicken.
In terms of dead chicken removal devices, research and applications have shown great promise. In the 1980s, robots began to be deployed in the field [20] to perceive environmental information and move autonomously. Liu et al. designed a vision-based dead broiler removal device [21] in which stainless-steel plates on both sides of the front end "swept" dead broilers onto a conveyor belt, which transported them to a storage warehouse at the back end. However, the slow movement of the device made it difficult to clear dead chickens from other locations in the flat chicken house. Hu et al. designed a dead broiler picking actuator with three joints and four fingers, based on the underactuated principle, for broilers raised in captivity for 3 to 7 weeks [22]; however, the device was structurally complex and had not been applied in the poultry industry. Zhao et al. conducted in-depth research on the kinematic characteristics of a five-degrees-of-freedom dead broiler-picking robotic arm, performing simulation analysis in MATLAB 2020a [23] and providing preliminary theoretical support for the design of subsequent control systems. However, this research remained at the theoretical level and did not include actual construction or experimental validation of the robotic arm.
Research on dead broilers has primarily concentrated on cage-raised birds, with scant attention paid to recognizing dead broilers in free-range scenarios, and existing models' ability to extract features and details from images of dead chickens remains limited. In terms of devices, despite relatively comprehensive theoretical exploration of structures for picking up dead broilers, the majority of studies have been confined to the theoretical level. In view of this, this study designed a simple, easy-to-use, and cost-effective device for grasping and moving dead broilers, with an average success rate of 81.3%. In addition, this study proposes an enhanced deep learning method for recognizing dead broilers based on YOLOv6n (hereinafter referred to as YOLOv6), addressing the problems of missing image details and missed detections in dense scenes.

2. Materials and Methods

2.1. Experimental Base

The experiment was performed in a large-scale free-range yellow-feather broiler farm at Jinniuhu Street, Luhe District, Nanjing City, Jiangsu Province, China (118°52′38″ E, 32°26′54″ N). Each broiler was about 15 weeks old, weighed about 1.45 kg, and had a chest width of about 12.95 cm. The interior scene of the experimental broiler house is shown in Figure 1. The data were collected from 8 March to 28 March 2023, and the testing experiment was conducted from 8 March to 12 March 2024.

2.2. Moving Chassis

The mobile chassis was an R550 (AKM) PLUS chassis (Wheel Technology Co., Ltd., Dongguan, China) with a suspended Ackermann structure carrying a depth camera and a laser radar. It could perform mapping and navigation, obstacle avoidance, sound source localization, and wireless communication with the mechanical arm, and it carried the image acquisition and processing subsystem. Figure 2 shows the structure of the dead broiler identification and grabbing device.

2.3. Image Acquisition and Processing System

The image acquisition and processing subsystem is composed of a binocular camera and a Jetson Nano (NVIDIA, Santa Clara, CA, USA). It identifies dead broilers and their coordinates in free-range broiler houses in real time via binocular vision positioning. The camera has a 5-megapixel sensor with a video resolution of 1920 × 1080. The Jetson Nano is a compact, feature-rich AI computing module developed by NVIDIA; it is equipped with a 128-core Maxwell GPU [24] and embedded with the trained dead broiler recognition model. Figure 3 shows the image acquisition and processing system.
Binocular vision positioning utilizes the principle of stereo vision to construct a three-dimensional model of the scene, thereby determining the specific position of the target in space [25]. The imaging model is shown in Figure 4.
Equation (1) was derived from similar triangles.
$Z = \dfrac{f \times b}{d}, \quad X = \dfrac{Z \times \mu_L}{f}$, (1)
where Z is the normal distance from the chicken to the baseline (b) of the camera, X is the lateral distance from the chicken to the center of the camera, f is the camera focal length, and d is the binocular camera parallax, equal to $\mu_L + \mu_R$.
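To make the geometry concrete, the following is a minimal sketch of Equation (1); the function name and the numerical values are illustrative assumptions, not the authors' implementation, and it assumes f and the pixel offsets share the same pixel units while b is in meters.

```python
def stereo_position(mu_l, mu_r, f, b):
    """Recover the normal (Z) and lateral (X) position of a target from
    Equation (1). mu_l/mu_r: pixel offsets in the left/right images;
    f: focal length in pixels; b: baseline in meters."""
    d = mu_l + mu_r              # binocular parallax, as defined in the text
    if d == 0:
        raise ValueError("Zero parallax: target is effectively at infinity")
    z = f * b / d                # normal distance to the camera baseline
    x = z * mu_l / f             # lateral distance to the camera center
    return x, z

# Illustrative values only: a chicken offset 40 px in the left image and
# 35 px in the right image, with f = 1000 px and b = 0.06 m.
x, z = stereo_position(40.0, 35.0, 1000.0, 0.06)
```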

2.4. Mechanical Arm

The dead broiler grasping manipulator was based on the Dobot manipulator (Dobot Technology Co., Ltd., Shenzhen, China), with the model and parameters of the manipulator and motors adapted to the specifications of broilers. The structure was 3D-printed in resin, and the control system communicated with the Jetson Nano via Bluetooth for data transmission and instruction interaction. The maximum load was 5 kg, and the repeat positioning accuracy was 0.2 mm. Physical diagrams of the manipulator and the end effector are shown in Figure 5.

2.5. Camera Calibration and Hand–Eye Calibration

In order to realize the transformation from the real-world coordinate system to the manipulator's coordinate system, camera calibration and hand–eye calibration were needed [26]. According to the principle of camera imaging, the camera projects the actual 3D scene into 2D information; to infer three-dimensional chicken information from two-dimensional images, it was necessary to calibrate the camera and locate the dead chicken's spatial position. In addition, the mobile chassis and the robotic arm control system operated based on the position of the dead chicken, while the binocular camera was not fixed on the robotic arm and its position could change. To ensure that the image information captured by the camera matched the device control system, it was necessary to unify the camera coordinate system and the robotic arm base coordinate system to achieve accurate visual positioning. This process is commonly referred to as hand–eye calibration, and a schematic diagram of the relationship between the three coordinate systems is shown in Figure 6.
In camera calibration and hand–eye calibration operations, calibration plates such as a checkerboard or solid circular arrays are usually used as auxiliary calibration objects. In this experiment, a checkerboard calibration plate was selected, as shown in Figure 7a. The checkerboard calibration board comprised a 10 × 7 checkerboard, where the size of each square was 26 mm × 26 mm. The camera calibration result is shown in Figure 7b.
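As an illustration of this step, the sketch below runs a standard OpenCV checkerboard calibration for one camera of the rig, assuming the 10 × 7 board described above (9 × 6 inner corners, 26 mm squares); the image file names are hypothetical.

```python
import cv2
import numpy as np
from glob import glob

# The 10 x 7 checkerboard above has 9 x 6 inner corners; squares are 26 mm.
pattern = (9, 6)
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * 26.0

obj_points, img_points, image_size = [], [], None
for path in glob("calib_*.jpg"):          # hypothetical image file names
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_points.append(objp)
        img_points.append(corners)
        image_size = gray.shape[::-1]     # (width, height)

# Intrinsics K and distortion coefficients for one camera of the rig; a
# binocular setup repeats this per camera and then runs cv2.stereoCalibrate.
ret, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, image_size, None, None)
```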
For hand–eye calibration, the same checkerboard was used as the auxiliary calibration object [27]. The process was as follows: First, a world coordinate system was established on the calibration board, and the corners of the calibration board were touched by the end of the robotic arm, as shown in Figure 8. As the world coordinate system was manually set on the calibration board, the positions of these corners in the world coordinate system were known. The position information of these corners was used to solve the transformation matrix between the world coordinate system ({World}) and the robotic arm coordinate system ({Base}) [26,28]. The transformation matrix $T_B^W$ from the robotic arm coordinate system to the world coordinate system is shown in Equation (2):
$T_B^W = \begin{bmatrix} R_B^W & P \\ 0 & 1 \end{bmatrix}$, (2)
where $R_B^W$ is the rotation matrix and P is the translation vector.
$R_B^W$ and P can be expressed through the position of the origin of the world coordinate system in the robotic arm coordinate system. The rotation matrix $R_B^W$ can be solved by introducing a transition coordinate system, obtained by translating the coordinate system of the dead chicken grasping arm so that it coincides with the origin of the world coordinate system, as shown in Figure 9 [29].
The transition coordinate system ({Transition}) is obtained from the robotic arm coordinate system through the translation vector, as shown in Equation (3). Since the origin of the transition coordinate system coincides with the origin of the world coordinate system, the specific position of the world coordinate system in the robotic arm coordinate system can be determined when the end of the robotic arm touches the origin, and the translation vector P can then be calculated:
$\begin{bmatrix} x_t \\ y_t \\ z_t \end{bmatrix} = \begin{bmatrix} x_b \\ y_b \\ z_b \end{bmatrix} + P$, (3)
where $x_b$, $y_b$, and $z_b$ are the three-dimensional coordinates in the {Base} coordinate system, and $x_t$, $y_t$, and $z_t$ are the three-dimensional coordinates in the {Transition} coordinate system.
By applying the rotation matrix R to the transition coordinate system, the coordinates in the robotic arm coordinate system can be obtained, as shown in Equation (4):
$\begin{bmatrix} x_b \\ y_b \\ z_b \end{bmatrix} = R \times \begin{bmatrix} x_t \\ y_t \\ z_t \end{bmatrix}, \quad R = \begin{bmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{bmatrix}$. (4)
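The sketch below assembles the homogeneous transform of Equation (2) and applies it in the spirit of Equations (3) and (4) to map a camera-located point into the arm's frame; the rotation and translation values are hypothetical placeholders for actual calibration results.

```python
import numpy as np

def homogeneous_transform(R, P):
    """Assemble the 4x4 matrix of Equation (2) from a 3x3 rotation matrix R
    and a 3-element translation vector P."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = np.asarray(P).ravel()
    return T

# Hypothetical calibration result: a 90-degree rotation about z and a
# translation of the world origin in the arm's coordinate system (meters).
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
P = np.array([0.25, 0.10, 0.05])
T = homogeneous_transform(R, P)

# A dead chicken located by the camera, expressed in {World} homogeneous
# coordinates, is mapped into {Base} coordinates for the grasping controller.
p_world = np.array([0.40, 0.20, 0.00, 1.0])
p_base = T @ p_world
```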

2.6. Data Collection and Image Data Set Production

The quality and quantity of data play a vital role in the effective application of deep learning algorithms. High-quality data can significantly improve the performance of deep learning models [30]. Dead yellow-feather broilers exhibit tightly closed eyes, bodies pressed to the ground, or weak, stiff bodies. The other behavior states of yellow-feather broilers in scattered broiler farms include eating, lying down, and walking [31]. The different behavior states of broilers are shown in Table 1.
Among these, “Dead” broilers need to be grasped, while “Others” do not. A schematic of the behaviors is shown in Figure 10.
After screening, 1565 original images were obtained. The images were labeled using LabelMe 3.16.7 with the label categories shown in Table 2 and converted to PASCAL VOC format for subsequent use. More than one behavior of yellow-feathered chickens could be seen in each image. After image preprocessing (resizing, random flipping, and translation transformation), 2456 images were obtained, which were divided into training, verification, and test sets according to a ratio of 8:1:1.
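For reference, a minimal sketch of the 8:1:1 split described above is given below; the directory layout, file extension, and random seed are assumptions rather than the authors' exact pipeline.

```python
import random
from pathlib import Path

# Gather the preprocessed images and shuffle them reproducibly.
images = sorted(Path("dataset/images").glob("*.jpg"))
random.seed(42)
random.shuffle(images)

# 8:1:1 split into training, validation, and test sets.
n = len(images)
n_train, n_val = int(0.8 * n), int(0.1 * n)
train = images[:n_train]
val = images[n_train:n_train + n_val]
test = images[n_train + n_val:]
```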

2.7. Recognition Model of Dead Yellow-Feather Broiler

Existing target detection algorithms based on deep learning are divided into single-stage or two-stage [32]. To select a suitable network model, three mainstream object detection algorithms—namely, SSD, Faster-RCNN, and YOLOv6—were trained and predicted using the same data set, and the model with the best comprehensive performance was selected. The deep learning methods used and their advantages and disadvantages are detailed in Table 3.
It can be seen from Table 3 that the Faster-RCNN model is more complex to train and is not easy to migrate to the image acquisition and processing system. However, YOLOv6 has the advantages of being lightweight, easy to migrate, and having low requirements for the development environment [33]. Therefore, the network model for detecting dead broilers was improved based on the YOLOv6 model.
The Squeeze-and-Excitation (SE) attention mechanism is a lightweight module designed to enhance the representational power of convolutional neural networks, as shown in Figure 11. It improves the network’s characterization ability by explicitly capturing the correlation between convolution channels [34].
The improved YOLOv6 network model is shown in Figure 12. The SE module compresses a feature map of scale H × W × C through global average pooling, retaining only the channel dimension C, so that the map is converted to 1 × 1 × C. After compression, a ReLU-activated bottleneck restores the channel dimension and, finally, a sigmoid activation converts the value of each channel into a weight, which is multiplied with the original input features to produce the recalibrated output features.
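A minimal PyTorch sketch of such an SE block, following the squeeze, excitation, and rescale steps just described, is given below; the reduction ratio of 16 is a common default from the literature, not necessarily the setting used in this study.

```python
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    """Squeeze-and-Excitation block: global average pooling (squeeze),
    a ReLU bottleneck with restoration, and a sigmoid gate whose
    per-channel weights rescale the input features."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction, bias=False),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels, bias=False),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        w = x.mean(dim=(2, 3))           # squeeze: H x W x C -> 1 x 1 x C
        w = self.fc(w).view(b, c, 1, 1)  # excitation: per-channel weights
        return x * w                     # rescale the original features

# Usage: recalibrate a 64-channel feature map without changing its shape.
out = SEBlock(64)(torch.randn(1, 64, 40, 40))
```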

2.7.1. Model Training Parameters

The hardware platform used for training was configured with a Tesla V100 graphics card with 24 GB of memory and an AMD EPYC 9654 CPU. Python 3.8 and PyTorch 1.9.0 were used.

2.7.2. Model Evaluation Metrics

When evaluating the performance of deep learning algorithms, the following key indicators are used: precision, which measures the proportion of true positive samples among the instances that the model predicts as positive; recall, which reflects the ability of the model to find all positive samples; mean average precision (maP), which comprehensively reflects the model's performance across the categories of a multi-category detection task; and the F1 score, which is the harmonic mean of precision and recall. Together, these indicators constitute a standard system for comprehensively evaluating the performance of deep learning algorithms [35]. Their formulas are shown in Equations (5)–(8):
$\mathrm{Precision} = \dfrac{TP}{TP + FP}$, (5)

$\mathrm{Recall} = \dfrac{TP}{TP + FN}$, (6)

$\mathrm{maP} = \dfrac{1}{N} \sum_{i=1}^{N} A(p_i)$, (7)

$F1 = \dfrac{2 \times \mathrm{Precision} \times \mathrm{Recall}}{\mathrm{Precision} + \mathrm{Recall}}$, (8)
where TP is the number of true positives, FP is the number of false positives, FN is the number of false negatives, N is the number of detected target classes, A is the corresponding accuracy, and $p_i$ is the change value of the recall.
The FPS (frames per second) is also used to measure the processing speed of the model in practical applications [36]; the larger the FPS value, the faster the model's detection speed.
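The following sketch implements Equations (5)–(8) directly from counted detections; the per-class average precision values would come from the detector's evaluation pipeline and are assumed to be given.

```python
def precision(tp, fp):
    """Equation (5): fraction of predicted positives that are correct."""
    return tp / (tp + fp) if (tp + fp) else 0.0

def recall(tp, fn):
    """Equation (6): fraction of actual positives that are found."""
    return tp / (tp + fn) if (tp + fn) else 0.0

def f1_score(p, r):
    """Equation (8): harmonic mean of precision and recall."""
    return 2 * p * r / (p + r) if (p + r) else 0.0

def map_score(ap_per_class):
    """Equation (7): mean of the per-class average precision values."""
    return sum(ap_per_class) / len(ap_per_class)

# Illustrative counts only.
p, r = precision(86, 14), recall(86, 11)
print(f1_score(p, r), map_score([0.92, 0.88]))
```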

2.8. The Design of the Dead Broiler Grasping Manipulator

The Robotics Toolbox in MATLAB provides powerful functions for simulating the kinematics and trajectory planning of a manipulator [37]. The manipulator model was established, and the pose of the end effector in the base coordinate system was calculated by the "forward_kinematics" function [38]. The D-H (Denavit-Hartenberg) parameters of the manipulator were constructed, and the simulation model of the manipulator is shown in Figure 13.
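Outside of MATLAB, the same forward kinematics can be sketched in a few lines of numpy using the standard D-H link transform; the four-joint parameters below are hypothetical placeholders, not the manipulator's actual geometry.

```python
import numpy as np

def dh_matrix(theta, d, a, alpha):
    """Standard Denavit-Hartenberg transform for one link."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(dh_params, joint_angles):
    """Chain the link transforms to get the end-effector pose in the base frame."""
    T = np.eye(4)
    for (d, a, alpha), theta in zip(dh_params, joint_angles):
        T = T @ dh_matrix(theta, d, a, alpha)
    return T

# Hypothetical (d, a, alpha) parameters for a four-joint arm, in meters/radians.
dh_params = [(0.10, 0.0, np.pi / 2), (0.0, 0.15, 0.0),
             (0.0, 0.15, 0.0), (0.0, 0.05, 0.0)]
pose = forward_kinematics(dh_params, [0.0, np.pi / 4, -np.pi / 4, 0.0])
```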
Path planning for the manipulator was realized via the quintic polynomial interpolation method. The rationality of the quintic polynomial interpolation method was verified by Robotics Toolbox, and the resulting diagram is shown in Figure 14.
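A quintic polynomial trajectory is fully determined by the boundary position, velocity, and acceleration at the start and end of the motion; the sketch below solves for the six coefficients and samples one joint's trajectory with illustrative values.

```python
import numpy as np

def quintic_coefficients(q0, qf, T, v0=0.0, vf=0.0, a0=0.0, af=0.0):
    """Solve for the coefficients of q(t) = c0 + c1*t + ... + c5*t^5 given
    boundary position, velocity, and acceleration constraints."""
    A = np.array([
        [1, 0,    0,       0,        0,        0],   # q(0)  = q0
        [0, 1,    0,       0,        0,        0],   # q'(0) = v0
        [0, 0,    2,       0,        0,        0],   # q''(0) = a0
        [1, T, T**2,    T**3,     T**4,     T**5],   # q(T)  = qf
        [0, 1, 2*T,  3*T**2,   4*T**3,   5*T**4],    # q'(T) = vf
        [0, 0,    2,    6*T, 12*T**2,  20*T**3],     # q''(T) = af
    ], dtype=float)
    b = np.array([q0, v0, a0, qf, vf, af], dtype=float)
    return np.linalg.solve(A, b)

# One joint moving from 0 to pi/2 rad in 2 s with zero boundary velocity
# and acceleration (illustrative values only).
c = quintic_coefficients(0.0, np.pi / 2, 2.0)
t = np.linspace(0.0, 2.0, 50)
q = sum(c[i] * t**i for i in range(6))   # smooth joint trajectory samples
```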

3. Results and Analysis

3.1. The Performance of the Different Models

After the model was trained and the parameters were adjusted, the final evaluation was conducted using the test set. The overall results for the category of yellow-feathered broilers using the three algorithms SSD, Faster-RCNN, and YOLOv6 are shown in Table 4 and Figure 15. The standard errors of Faster-RCNN and YOLOv6 are shown in Table 5.
It can be seen from Table 4, Table 5, and Figure 15 that the detection accuracy of YOLOv6 is comparable to that of Faster-RCNN. However, YOLOv6 has lower standard errors of precision and recall, meaning that its results are less dispersed. The YOLOv6 and SSD algorithms show obvious advantages in running speed, which relates to their one-stage architectures [39,40]; among them, YOLOv6 has the fastest detection speed, as shown in Figure 15.
Although YOLOv6 achieves good overall detection, it cannot identify distant broilers when the image background is complex, as shown in Figure 16. It can also be seen from Figure 16 that the recognition of the broiler behavior state in the upper left corner is not ideal, leading to misidentification.
We therefore optimized the YOLOv6 model to improve its performance. The rationality of the SE module was verified through ablation experiments, as shown in Table 6, Figure 17, and Figure 18.
As can be seen from Table 6, Figure 17, and Figure 18, adding the CBAM or SE attention module to YOLOv6 yielded different levels of improvement in maP, F1 score, precision, and recall. The model incorporating the SE attention mechanism achieved the best results in recognizing the death label category. Because the CBAM module greatly increases the number of parameters, its detection accuracy differed only slightly from that of the SE module, but its operation speed was significantly lower than that of the more lightweight SE module.
In order to enable the YOLOv6 + SE network model to handle larger pictures and videos of henhouse scenes and to improve its global perception, this study further improved YOLOv6 + SE with ASPP (Atrous Spatial Pyramid Pooling). ASPP leverages atrous convolutions with different dilation rates to extract multi-scale features without losing resolution or significantly increasing the computational cost. By applying parallel atrous convolutions with varying dilation rates, ASPP can effectively aggregate contextual information at multiple scales. The performance comparison between the original and improved models is shown in Table 7 and Figure 19, and the recognition result of the improved YOLOv6 + SE is shown in Figure 20.
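A minimal PyTorch sketch of an ASPP block in this spirit is given below; the dilation rates (1, 6, 12, 18) are common defaults from the literature, not necessarily those used in the improved model.

```python
import torch
import torch.nn as nn

class ASPP(nn.Module):
    """Parallel 3x3 atrous convolutions with different dilation rates,
    concatenated and fused by a 1x1 convolution. Setting padding equal to
    the dilation rate keeps every branch at the input resolution."""
    def __init__(self, in_ch: int, out_ch: int, rates=(1, 6, 12, 18)):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=r, dilation=r)
            for r in rates
        ])
        self.fuse = nn.Conv2d(out_ch * len(rates), out_ch, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = torch.cat([branch(x) for branch in self.branches], dim=1)
        return self.fuse(feats)

# Usage: aggregate multi-scale context from a 256-channel feature map.
out = ASPP(256, 128)(torch.randn(1, 256, 40, 40))
```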
Comparing Figure 20 with Figure 16, it can be seen that the improved YOLOv6 identified significantly more broilers in different behavior states, showing that the ASPP module reduces missed detections of broiler states in complex backgrounds. At the same time, the introduction of ASPP also improved the iteration speed and detection speed to a certain extent, compensating for the increased number of parameters brought about by the attention mechanism. The training loss curves of the models are shown in Figure 21.

3.2. The Real-Time Detection Effect of the Model

The improved YOLOv6 network model was migrated to Jetson Nano, and real-time detection was performed by connecting a display screen. The result is shown in Figure 22.
The red box shows the detection results, the green box shows the video stream, and the yellow box lists the behavior states of the detected broilers and their detection times. This experiment confirmed that the improved network model was successfully migrated to the Jetson Nano and that the real-time identification effect is good.
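A deployment loop of this kind can be sketched as follows; `detect` is a hypothetical placeholder for the improved model's inference call, and the camera index and drawing details are assumptions.

```python
import time
import cv2

def detect(frame):
    """Hypothetical stand-in for the improved YOLOv6 + SE inference call;
    it should return (label, confidence, (x1, y1, x2, y2)) tuples."""
    return []

cap = cv2.VideoCapture(0)                    # one stream of the binocular camera
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    t0 = time.time()
    detections = detect(frame)
    fps = 1.0 / max(time.time() - t0, 1e-6)  # on-device detection speed
    for label, conf, (x1, y1, x2, y2) in detections:
        cv2.rectangle(frame, (x1, y1), (x2, y2), (0, 255, 0), 2)
        cv2.putText(frame, f"{label} {conf:.2f}", (x1, y1 - 5),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 255, 0), 1)
    cv2.imshow("dead-broiler detection", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):    # press q to quit
        break
cap.release()
cv2.destroyAllWindows()
```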

3.3. The Design of the Dead Broiler Grasping Experiment

In this experiment, first, a dead broiler was placed at a point that the mobile device could easily reach and grasp. Second, before being turned on, the device was placed where it could observe the dead broiler. Finally, the device was turned on and operated. The process of catching a dead broiler with the device is shown in Figure 23.
During the experiment, it was found that different parts of the dead broilers were grabbed, as shown in Figure 24, and the success rate of the device depended on which body part was grabbed. Three grabbing modes were examined in three groups: group (a), grabbing by the back, recorded as experiment 1; group (b), grabbing by the back and chest, recorded as experiment 2; and group (c), grabbing by the chest, recorded as experiment 3. The position of the device was kept unchanged between experiments; the detailed grabbing results are shown in Table 8.
The mobile device for identifying and grasping dead broilers completed the task of removing dead broilers with a success rate of over 77% in every experiment and an average success rate of 81.3%. The success rate was lowest when grabbing the feet and other parts with small contact areas. The success rate also decreased in areas where broilers were densely gathered, which may reflect the device's failure to collect information on the dead broilers; however, the mobile device can disperse the birds during its operation, capture the required information, and complete the grabbing of the dead broilers.

4. Conclusions

(1)
The experimental results demonstrated that the mobile device developed in this study was capable of fulfilling the task of removing dead broilers, achieving a success rate of over 77% and an average success rate of 81.3%. However, the success rate was lowest when grasping parts such as the neck, feet, or other areas with small contact surfaces; although the presence of the mobile device itself exerted a certain influence on the success rate, this aspect will be the focus of future improvements. Additionally, the success rate of grasping dead broilers decreased in densely populated areas, which can be attributed to the device's limited ability to collect information on dead broilers. Nevertheless, the mobile device could disperse the broilers during movement, capture the necessary information, and complete the grasping task.
(2)
This study proposes an enhanced deep learning-based approach for identifying broilers. The YOLOv6 algorithm, with its superior comprehensive performance, was selected as the basic network and underwent in-depth optimization. Specifically, a YOLOv6 network structure based on the SE attention mechanism and ASPP was proposed to address the existing issues found in broiler houses. The experimental outcomes indicated that the recognition accuracy of the improved algorithm model reached 86.1%.
(3)
This study designed a mechanical arm for positioning and grasping dead broilers. A model joint simulation of the manipulator was conducted, and the motion trajectory was planned. The experimental results verified that the manipulator model passed the test, the transmission was stable, and the trajectory met the requirements, thereby providing the essential conditions for achieving stable grasping and attaining the design objective.
This study focused on the automatic identification and removal of dead broilers in large-scale flat-breeding yellow-feather broiler farms, aiming to develop a solution that combines vision technology and robotic arm control technology. In order to solve the challenge of dead broiler identification in complex environments, this study proposed a dead broiler detection algorithm with high accuracy, high speed, and easy portability. Compared with traditional machine learning methods, the algorithm achieved significant improvements in accuracy and real-time performance, ensuring that the speed requirements of the dead broiler cleaning process can be met.
In addition, this study independently developed a vision-based mobile device for dead broiler collection that successfully achieved the expected design goals and was capable of efficiently and rapidly identifying and disposing of dead broilers in large-scale free-range broiler farms. The device has a modular design, which not only facilitates future function expansion, maintenance, and system upgrades but also improves overall flexibility.
Despite these results, there are some limitations of this study, as follows:
(1)
Limitations of applicability—The current study was tested and optimized mainly with respect to yellow-feathered broilers. Given the wide variety of broiler breeds available in the market, further verification of the applicability of the device for other breeds is required. If the recognition effect is found to be poor, a breed-specific image database needs to be established as a benchmark for recognition.
(2)
Efficiency and energy-saving considerations—When there are multiple dead broilers in the coop at the same time, although the device is able to effectively detect and remove them, further research is needed into how to optimize path planning for a more efficient operation from the point of view of improving efficiency and energy use.
Future work will focus on improvements and refinements in both of these areas, in order to further increase the usefulness and adaptability of the system.

Author Contributions

Conceptualization, C.X. and X.Z.; methodology, C.X. and X.Z.; validation, C.X., H.L. and X.Z.; formal analysis, C.X., H.L. and Y.L.; investigation, C.X., H.L., Y.L. and M.W.; data curation, C.X., H.L., Y.L., M.W. and W.L.; writing—original draft preparation, C.X., H.L., Y.L., M.W. and W.L.; writing—review and editing, S.W., W.Z., M.X. and X.Z.; visualization, C.X., H.L. and Y.L.; supervision, S.W. and X.Z.; project administration, S.W., W.Z., M.X. and X.Z.; funding acquisition, S.W. and X.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the International Science and Technology Cooperation Program of Jiangsu Province (Grant No. BZ2023013), Xuzhou Key Research and Development Project (Modern Agriculture) (Grant No. KC21135), and National Entrepreneurship Training Program for University Students (Grant No. 202310307223E).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data are contained within the article.

Acknowledgments

We are thankful to Yungang Bai, Sunyuan Wang, and Zhilong Chen, who contributed to our field data collection and primary data analysis.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Zhou, S.; Watcharaanantapong, P.; Yang, X.; Thornton, T.; Gan, H.; Tabler, T.; Zhao, Y. Evaluating broiler welfare and behavior as affected by growth rate and stocking density. Poult. Sci. 2024, 103, 103459. [Google Scholar] [CrossRef]
  2. Li, G.; Zhao, Y.; Porter, Z.; Purswell, J. Automated measurement of broilers under four stocking densities via faster region-based convolutional neural network. Animal 2021, 15, 100059. [Google Scholar] [CrossRef]
  3. Chen, G.; Ling, X.; Xie, M.; Xiong, Y.; Li, T.; Shui, C.; Li, C.; Xu, B.; Ma, F. Systematic evaluation of the meat qualities of free-range chicken (Xuan-Zhou) under different ages explored the optimal slaughter age. Poult. Sci. 2024, 103, 104019. [Google Scholar] [CrossRef]
  4. Park, J.; Kwon, O.; Lee, K.; Heo, Y.; Yoon, C. Ammonia and Hydrogen Sulfide Monitoring in Broiler Barns and Cattle Barns. J. Environ. Health Sci. 2015, 41, 277–288. [Google Scholar] [CrossRef]
  5. Charman, W. Visual standards for driving. Ophthal. Physiol. Opt. 2015, 5, 211–220. [Google Scholar] [CrossRef]
  6. Li, X.; Yan, F.; Hu, K.; He, X.; Ma, Z.; Yuan, Q.; Mirzoev, S.I. Design and test of the integrated machine for self-propelled manure collection and bagging in flat chicken coops. Trans. Chin. Soc. Agric. Eng. 2024, 40, 251–261. [Google Scholar]
  7. Shi, Z.; Xi, L.; Ji, Z.; Cheng, P. LED illuminant improving broilers house environment and growth performance. Trans. Chin. Soc. Agric. Eng. 2017, 33, 222–227. [Google Scholar]
  8. Hinojosa, C.; Caldwell, D.; Byrd, J.; Ross, M.; Stringfellow, K.; Fowlkes, E.; Lee, J.; Stayer, P.; Farnell, Y.; Farnell, M. Use of a foaming disinfectant and cleaner to reduce aerobic bacteria on poultry transport coops. J. Appl. Poult. Res. 2015, 24, 364–370. [Google Scholar] [CrossRef]
  9. Okada, H.; Itoh, T.; Suzuki, K.; Tsukamoto, K. Wireless sensor system for detection of avian influenza outbreak farms at an early stage. In Proceedings of the 8th IEEE Conference on Sensors, Christchurch, New Zealand, 25–28 October 2009; pp. 1374–1377. [Google Scholar]
  10. Aydin, A.; Bahr, C.; Viazzi, S.; Exadaktylos, V.; Buyse, J.; Berckmans, D. A novel method to automatically measure the feed intake of broiler chickens by sound technology. Comput. Electron. Agric. 2008, 101, 17–23. [Google Scholar] [CrossRef]
  11. Cao, Y.; Teng, G.; Yu, L.; Li, Q. Comparison of different de-noising methods in vocalization environment of laying hens including fan noise. Trans. Chin. Soc. Agric. Eng. 2014, 30, 212–218. [Google Scholar]
  12. Jacob, F.; Baracho, M.; Nääs, I.; Souza, R.; Salgado, D. The use of infrared thermography in the identification of pododermatitis in broilers. Eng. Agric. 2016, 36, 253–259. [Google Scholar] [CrossRef]
  13. Wang, J.; Wang, N.; Li, L.; Ren, Z. Real-time behavior detection and judgement of egg breeders based on YOLO v3. Neural Comput. Appl. 2020, 32, 5471–5481. [Google Scholar] [CrossRef]
  14. Zhang, Q. Application and Innovation of New Media Technology in Visual Communication Design. Agro Food Ind. Hi Tech. 2017, 28, 3170–3173. [Google Scholar]
  15. Shafay, M.; Ahmad, R.; Salah, K.; Yaqoob, I.; Jayaraman, R.; Omar, M. Blockchain for deep learning: Review and open challenges. Clust. Comput. 2023, 26, 197–221. [Google Scholar] [CrossRef]
  16. Lu, C. Study on Dead Birds Detection System Based on Machine Vision in Modern Chicken Farm. Master’s Thesis, JiangSu University, Zhenjiang, China, 2009. [Google Scholar]
  17. Zheng, S.; Wang, L. Development of Monitoring System for Layers Rearing in Multi-Tier Vertical Cages Using Machine Vision. J. Jilin Agric. Univ. 2009, 31, 476–480. [Google Scholar]
  18. Qu, Z. Study on Detection Method of Dead Chicken in Unmanned Chicken Farm. Master’s Thesis, Jilin University, Changchun, China, 2009. [Google Scholar]
  19. Veera, V.; Yang, Z. Automatic Identification of Broiler Mortality Using Image Processing Technology. ASABE Meet. Present. 2018, 18, 3–5. [Google Scholar]
  20. Sequeira, J.; Ribeiro, M. Human-robot interaction and robot control. Robot. Motion Control Recent. 2006, 35, 375–390. [Google Scholar]
  21. Liu, H.; Chen, C.; Tsai, Y.; Hsieh, K.; Lin, H. Identifying images of dead chickens with a chicken removal system integrated with a deep learning algorithm. Sensors 2021, 21, 3579. [Google Scholar] [CrossRef]
  22. Hu, Z. Research on Underactuated End Effector of Dead Chicken Picking Robot. Master’s Thesis, HeBei Agricultural University, Baoding, China, 2021. [Google Scholar]
  23. Zhao, W.; Chen, Y.; Cao, T. Design and kinematics analysis of robotic arm used for picking up dead chickens. J. Chin. Agric. Mech. 2023, 44, 131–136. [Google Scholar]
  24. Li, Y.; Liu, Q.; Li, T.; Wu, Y.; Niu, Z.; Hou, J. Design and experiments of garlic bulbil orientation adjustment device using Jetson Nano processor. Trans. Chin. Soc. Agric. Eng. 2021, 37, 35–42. [Google Scholar]
  25. Wang, S.; Hu, Y. Binocular visual positioning under inhomogeneous, transforming and fluctuating media. Trait. Du Signal 2018, 35, 253–276. [Google Scholar] [CrossRef]
  26. Li, X.; Zhang, E.; Fang, X.; Zhai, B. Calibration Method for Industrial Robots Based on the Principle of Perigon Error Close. IEEE Access 2022, 10, 48569–48576. [Google Scholar] [CrossRef]
  27. Yue, W.; Hua, S.; Ying, D. A Complete Analytical Solution to Hand-Eye Calibration Using Quaternions and Eigenvector-Eigenvalue Identity. J. Intell. Robot. Syst. 2023, 109, 54. [Google Scholar]
  28. Shang, Y.; Shen, J.; Wei, W.; Zheng, B. Optimization of Ball Mill Cylinder Structure Based on Response Surface Optimization Module and Multi-objective Genetic Algorithm. J. Mech. Sci. Technol. 2024, 38, 3631–3640. [Google Scholar] [CrossRef]
  29. Feng, X.; Tian, D.; Wu, H. A matrix-solving hand-eye calibration method considering robot kinematic errors. J. Manuf. Process. 2023, 99, 618–635. [Google Scholar] [CrossRef]
  30. Cui, G.; Qiao, L.; Li, Y.; Chen, Z.; Liang, Z.; Xin, C.; Xiao, M.; Zou, X. Division of Cow Production Groups Based on SOLOv2 and Improved CNN-LSTM. Agriculture 2023, 13, 1562. [Google Scholar] [CrossRef]
  31. Lao, F.; Teng, G.; Li, J.; Yu, L.; Li, Z. Behavior recognition method for individual laying hen based on computer vision. Trans. Chin. Soc. Agric. Eng. 2018, 28, 157–163. [Google Scholar]
  32. Guo, J.; He, G.; Deng, H.; Fan, W.; Xu, L.; Cao, L.; Feng, D.; Li, J.; Wu, H.; Lv, J.; et al. Pigeon cleaning behavior detection algorithm based on lightweight network. Comput. Electron. Agric. 2022, 199, 107032. [Google Scholar] [CrossRef]
  33. Bist, B.; Subedi, S.; Yang, X.; Chai, L. A Novel YOLOv6 Object Detector for Monitoring Piling Behavior of Cage-Free Laying Hens. AgriEngineering 2023, 5, 905–923. [Google Scholar] [CrossRef]
  34. Xie, W.; Kimura, M.; Takaki, K.; Asada, Y.; Iida, T.; Jia, X. Interpretable Framework of Physics-Guided Neural Network with Attention Mechanism: Simulating Paddy Field Water Temperature Variations. Water Resour. Res. 2022, 58, e2021WR030493. [Google Scholar] [CrossRef]
  35. Yun, Y.; Choi, J.; Chung, H.; Bae, K.; Moon, J. Performance evaluation of an occupant metabolic rate estimation algorithm using activity classification and object detection models. Build. Environ. 2024, 252, 111299. [Google Scholar] [CrossRef]
  36. Sun, F.; Wang, Y.; Lan, P.; Zhang, X.; Chen, X.; Wang, Z. Identification of apple fruit diseases using improved YOLOv5s and transfer learning. Trans. Chin. Soc. Agric. Eng. 2022, 38, 171–179. [Google Scholar]
  37. Corke, P. MATLAB toolboxes: Robotics and vision for students and teachers. IEEE Robot. Autom. Mag. 2007, 14, 16–17. [Google Scholar] [CrossRef]
  38. Vila-Rosado, D.; Domínguez-López, J. A MATLAB toolbox for robotic manipulators. In Proceedings of the Sixth Mexican International Conference on Computer Science, Puebla, Mexico, 26–30 September 2005; pp. 256–263. [Google Scholar]
  39. Liu, Y.; He, Y.; Wu, X.; Wang, W.; Zhang, L.; Lv, H. Potato Sprouting and Surface Damage Detection Method Based on Improved Faster R-CNN. Trans. Chin. Soc. Agric. Mach. 2024, 55, 371–378. [Google Scholar]
  40. Biswas, D.; Su, H.; Wang, C.; Stevanovic, A.; Wang, W. An automatic traffic density estimation using Single Shot Detection (SSD) and Mobile Net-SSD. Phys. Chem. Earth 2019, 110, 176–184. [Google Scholar] [CrossRef]
Figure 1. The interior scene of the experimental broiler house.
Figure 2. The structure of the dead broiler identification and grabbing device.
Figure 3. The image acquisition and processing system.
Figure 4. Binocular camera imaging model.
Figure 5. The dead broiler grasping manipulator: (a) mechanical arm backbone; (b) end effector.
Figure 6. Diagram of coordinate system transformation.
Figure 7. The checkerboard calibration board: (a) original image; (b) result.
Figure 8. Hand–eye calibration operation.
Figure 9. The transition coordinate system.
Figure 10. Different behavior images of yellow-feather broilers: (a) walking; (b) pecking; (c) resting; (d) inactive; (e) dead.
Figure 11. The SE module's structure diagram.
Figure 12. Schematic diagram of the improved YOLOv6 network structure.
Figure 13. Simulation model of the manipulator.
Figure 14. The simulation result graphs: (a) acceleration; (b) velocity; (c) position. The blue, red, purple, and orange lines represent the movements of joints 1, 2, 3, and 4, respectively.
Figure 15. Speed comparison of the YOLOv6, SSD, and Faster-RCNN detection models.
Figure 16. Recognition result of the YOLOv6 model.
Figure 17. Comparison of the results of ablation experiments with different labeling categories.
Figure 18. Speed comparison of the YOLOv6, YOLOv6 + SE, and YOLOv6 + CBAM detection models.
Figure 19. Speed comparison of the YOLOv6, YOLOv6 + SE, and improved YOLOv6 + SE detection models.
Figure 20. Recognition result of the improved YOLOv6 + SE.
Figure 21. Training loss curves: (a) SSD; (b) Faster-RCNN; (c) improved YOLOv6 + SE.
Figure 22. The real-time detection results.
Figure 23. The grasping process: (a) identifying the dead broiler; (b) moving; (c) grabbing the dead broiler; (d) receiving the dead broiler; (e) transporting the dead broiler.
Figure 24. The different parts grasped: (a) back grab; (b) back and chest grab; (c) chest grab.
Table 1. Different behavioral definitions of yellow-feather broilers.

| Behavior Classification | Classification Definition |
| Dead | The yellow-feather broiler's eyes are closed, the body clings to the ground, or the body is weak and stiff. |
| Others | Motions such as walking, pecking, inactivity, and resting. |

Table 2. The definitions of categories for classification and the number of observations for each category.

| Label Name (Number) | Label Definition |
| Walking (20,864) | Actions such as standing, walking, and arranging feathers |
| Pecking (15,648) | Yellow-feathered broilers with their heads touching the ground, troughs, or water troughs, or with tails cocked up |
| Resting (10,432) | Yellow-feathered broilers lying on the ground |
| Inactive (5216) | Yellow-feathered broilers lying on their backs with their bodies curled up or their tails drooping |
| Dead (32,166) | Yellow-feathered broilers lying flat on the ground with their bodies in a rigid state |

Table 3. Comparison of the advantages and disadvantages of the three algorithms.

| Models | Exact Deep Learning Method | Pros | Cons |
| SSD | Single deep network for both object classification and localization using default boxes | Fast, simple, and effective for a wide range of object sizes | Less accurate on very small objects |
| Faster-RCNN | Two-stage detector with a region proposal network (RPN) for generating candidate regions | High accuracy, flexible backbone networks, faster than previous R-CNN versions | Slower than one-stage detectors, more complex to train |
| YOLOv6 | Single-pass detector with efficient backbones and improved training techniques | Very fast, simple architecture, competitive accuracy | Limited information available, may struggle with small objects |

Table 4. The overall object detection results for the category of yellow-feathered broilers under the three models.

| Models | Precision | Recall | F1 Score | maP |
| YOLOv6 | 0.80 | 0.81 | 0.80 | 0.86 |
| SSD | 0.78 | 0.78 | 0.78 | 0.80 |
| Faster-RCNN | 0.81 | 0.81 | 0.81 | 0.87 |

Table 5. The standard errors of Faster-RCNN and YOLOv6.

| Models | Standard Error of Precision | Standard Error of Recall | Standard Error of F1 Score |
| YOLOv6 | 0.62% | 0.51% | 0.68% |
| Faster-RCNN | 0.81% | 0.77% | 0.73% |

Table 6. Comparison of the results of the ablation experiments for the overall categories.

| Models | Precision | Recall | F1 Score | maP |
| YOLOv6 + SE | 0.84 | 0.88 | 0.83 | 0.90 |
| YOLOv6 + CBAM | 0.82 | 0.86 | 0.84 | 0.90 |

Table 7. Comparison of the improved and unimproved model results.

| Models | Precision | Recall | F1 Score | maP |
| YOLOv6 + SE | 0.84 | 0.88 | 0.83 | 0.90 |
| Improved YOLOv6 + SE | 0.86 | 0.89 | 0.87 | 0.92 |
Table 8. The grabbing results of the three experiments.

| Experimental Serial Number | Grab Attempts | Success Rate |
| 1 | 30 | 0.87 |
| 2 | 30 | 0.80 |
| 3 | 30 | 0.77 |

