Article

Research and Design of a Chicken Wing Testing and Weight Grading Device

Kelin Wang, Zhiyong Li, Chengyi Wang, Bing Guo, Juntai Li, Zhengchao Lv and Xiaoling Ding
1 College of Mechanical & Electronic Engineering, Shandong Agricultural University, Taian 271018, China
2 College of Information Science & Engineering, Shandong Agricultural University, Taian 271018, China
* Author to whom correspondence should be addressed.
These authors contributed equally to this work.
Electronics 2024, 13(6), 1049; https://doi.org/10.3390/electronics13061049
Submission received: 27 January 2024 / Revised: 27 February 2024 / Accepted: 7 March 2024 / Published: 12 March 2024

Abstract

This paper introduces a nondestructive inspection and weight grading device for chicken wings designed to replace traditional manual grading. A two-sided nondestructive quality inspection model for chicken wings based on the YOLO v7-tiny object detection algorithm is designed and deployed on a Jetson Xavier NX embedded platform. An STM32 microcontroller serves as the main control platform, and a wing-turning device that adapts to the conveyor belt speed, a dynamic weighing unit, and a high-efficiency intelligent grading unit are developed; the prototype is then optimized and verified in experiments. Experiments show that the device can grade four chicken wings per second with a comprehensive accuracy of 98.4%, outperforming traditional grading methods in both efficiency and accuracy.

1. Introduction

Chicken wings play an essential role in China's meat consumption. However, problems such as deterioration, breakage, and feather residue inevitably occur during the pre-processing, transportation, and storage of chicken wings, and wings usually need to be graded by weight or size in post-processing to meet different consumer demands [1]. China's related industries still rely heavily on manual quality inspection and grading, which suffer from low accuracy and slow throughput, and large-scale assembly-line grading devices are lacking [2]. In developed countries, a few companies, such as Marel (Iceland) and Meyn (the Netherlands), offer automated meat inspection solutions, but their high cost and complex operation make them unsuitable for the small and medium-sized processing plants that are common in China, so there is currently a large market gap in this field [3,4].
To address the above problems, our group has developed two generations of chicken wing nondestructive testing and grading devices since 2016 [5]. The first-generation prototype implemented the chicken wing recognition and detection algorithm in MATLAB 2022b and used a 51-series microcontroller as the control platform to operate a robotic arm that completed the grading operation; its 3D structure is shown in Figure 1a. However, the first-generation platform failed to achieve accurate conveyor speed control, and the servo response delay was significant, resulting in poor accuracy when the device ran at high speed [6]. The group then developed a second-generation prototype, which counts color cumulative histograms of chicken wing images in the HSV (Hue, Saturation, Value) color space to achieve nondestructive detection of stasis-blooded chicken wings, and upgraded the control platform to an STM32; its 3D structure is shown in Figure 1b [7]. However, the structure of its machine vision detection section is relatively complicated, and the efficiency of the robotic arm is difficult to improve because of its operating mode. The third-generation prototype introduced in this paper builds on the previous two generations, with upgrades to the grading algorithm, control system, and actuator units.

2. Design of the General Structure

Figure 2 shows the structure of the third-generation chicken wing nondestructive testing and weight grading device. The whole machine adopts an "L-type" arrangement, which offers higher space utilization and easier mobility than an in-line arrangement. The device comprises a conveyor module, a detection unit, a turning mechanism, weighing equipment, an execution unit, and storage devices, which together realize the nondestructive testing and grading process as follows. First, the chicken wings are transported to the visual inspection area of conveyor belt 1, where the front side of each wing is photographed and its appearance quality is inspected by CMOS camera ②. The wings then drop into the V-shaped groove of the turning device ⑪, which flips them over onto conveyor belt 2. After flipping, CMOS camera ③ inspects the reverse side. If the result is unqualified, the robotic arm pushes the wing into the reject storage box ④; if it is qualified, the wing passes over the weight sensor ⑤ (conveyor belt 3) for weighing and then enters conveyor belt 4 for grading. When a wing reaches the photoelectric sensor ⑥ of the corresponding grade, the stepper motor drives the sliding table ⑦ and the pusher bar ⑨ to push it into the corresponding storage box ⑧. The Jetson Xavier NX development board [8], which runs the deep learning detection model, and the central control platform, an STM32F407 MCU, are housed in the control cabinet ⑫, and the MCGS TPX7062Ti touch screen ① serves as the human-computer interaction terminal. The workflow is shown in Figure 3.

3. Testing Model of Chicken Wings

3.1. Acquisition and Processing of Datasets

The main objective of the inspection is to detect and reject chicken wings with bruises, damage, feather residue, and similar defects caused by disease or pre-processing. The chicken wings collected in this study came from white-feathered broiler chickens produced in Changqing, Jinan, Shandong, China, slaughtered at 45 days of age. All wings were purchased at a farmer's market on the day of image acquisition and had been de-feathered and cleaned. A Hikvision CMOS camera (model MV-CA060-10GC) [9] captured the images at a resolution of 3072 × 2048; two such cameras are mounted over the conveyor belts to capture the front and back images, respectively. In this study, 2082 original chicken wing images were collected, and the dataset was expanded to 8328 images by flipping, scaling, changing the brightness, and adding Gaussian noise with OpenCV. The sample dataset is shown in Figure 4, and an example of the augmentation effect is shown in Figure 5. The dataset was annotated with the LabelImg tool, and the labels were categorized as 'normal' and 'sick' to represent qualified and unqualified chicken wings, respectively. The grading standard used in this paper is the Chinese National Standard "Livestock and Poultry Meat Quality Grading-Chicken Meat" (GB/T 19676-2022) [10].
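For reference, the following is a minimal OpenCV sketch of the four augmentation operations mentioned above (flipping, scaling, brightness change, and Gaussian noise); the specific parameter values are illustrative assumptions rather than the ones used to build the dataset.

```python
import cv2
import numpy as np

def augment(img: np.ndarray) -> list[np.ndarray]:
    # Sketch of the four augmentations used to expand the dataset;
    # scale factor, brightness offset, and noise level are assumed values.
    flipped = cv2.flip(img, 1)                               # horizontal flip
    scaled = cv2.resize(img, None, fx=0.8, fy=0.8)           # scale to 80%
    brighter = cv2.convertScaleAbs(img, alpha=1.0, beta=40)  # raise brightness
    noise = np.random.normal(0, 15, img.shape).astype(np.float32)
    noisy = np.clip(img.astype(np.float32) + noise, 0, 255).astype(np.uint8)
    return [flipped, scaled, brighter, noisy]
```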

3.2. YOLO v7-Tiny Network

Considering that the model needs to be deployed on an embedded platform, the lightweight YOLO v7-tiny model is used as the base algorithm for improvement. YOLO v7-tiny is a simplified version of YOLO v7, and its overall network structure is basically the same. The main differences are that the ELAN structure in the backbone has one fewer connecting branch than in YOLO v7 and that multiple max-pooling layers are used instead of the feature pyramid structure [11]. In addition, it does not use re-parameterized convolution in the head of the network and adopts the LeakyReLU activation function. These changes significantly reduce the model size and computational complexity while maintaining high detection performance. The YOLO v7-tiny network structure is shown in Figure 6 [12].

3.3. Improvements of YOLO v7-Tiny Network

3.3.1. Adding Attention Mechanisms

To further improve the model's ability to capture chicken wing features, this study introduces the CoordAtt (Coordinate Attention, CA) mechanism into the YOLO v7-tiny backbone network; its position in the network is shown in Figure 7 [13]. CoordAtt improves the attention module's perception of spatial location by embedding coordinate information along the spatial dimensions, and its advantage is particularly obvious when processing visual tasks with high-resolution feature maps; its encoding process is shown in Figure 8.
For a feature x_c(i,j) at height index i and width index j on channel c of the input X, the output of the coordinate attention module can be expressed as Formula (1):
$$y_c(i,j) = x_c(i,j) \times g_c^h(i) \times g_c^w(j) \tag{1}$$
In the formula, y_c(i,j) is the output after embedding the coordinate attention module; g_c^h(i) is the attention weight along the height direction on channel c; and g_c^w(j) is the attention weight along the width direction on channel c. To compare the optimization effects of different attention mechanisms on the model, two other commonly used attention mechanisms, CBAM and SENet, are introduced for comparison and added at the same position as the CA attention mechanism. Their structures are shown in Figure 9 and Figure 10, and the results are shown in Table 1.
From the data in the table above, it can be seen that the CA attention mechanism has the most significant improvement in the evaluation index of the model.
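As an illustration of Formula (1), the following PyTorch sketch shows how a coordinate attention block of this kind can be implemented; the reduction ratio and activation choice follow the original CoordAtt design and are assumptions rather than a copy of the module actually inserted into YOLO v7-tiny.

```python
import torch
import torch.nn as nn

class CoordAtt(nn.Module):
    # Minimal coordinate attention sketch: pool along each spatial axis,
    # encode jointly, then produce per-axis attention weights g^h and g^w.
    def __init__(self, channels: int, reduction: int = 32):
        super().__init__()
        mid = max(8, channels // reduction)
        self.pool_h = nn.AdaptiveAvgPool2d((None, 1))  # (B, C, H, 1): average over width
        self.pool_w = nn.AdaptiveAvgPool2d((1, None))  # (B, C, 1, W): average over height
        self.conv1 = nn.Conv2d(channels, mid, kernel_size=1)
        self.bn1 = nn.BatchNorm2d(mid)
        self.act = nn.Hardswish()
        self.conv_h = nn.Conv2d(mid, channels, kernel_size=1)
        self.conv_w = nn.Conv2d(mid, channels, kernel_size=1)

    def forward(self, x):
        b, c, h, w = x.size()
        x_h = self.pool_h(x)                       # (B, C, H, 1)
        x_w = self.pool_w(x).permute(0, 1, 3, 2)   # (B, C, W, 1)
        y = torch.cat([x_h, x_w], dim=2)           # joint encoding of both directions
        y = self.act(self.bn1(self.conv1(y)))
        y_h, y_w = torch.split(y, [h, w], dim=2)
        y_w = y_w.permute(0, 1, 3, 2)              # back to (B, mid, 1, W)
        g_h = torch.sigmoid(self.conv_h(y_h))      # attention along height
        g_w = torch.sigmoid(self.conv_w(y_w))      # attention along width
        return x * g_h * g_w                       # Formula (1)
```

For example, `CoordAtt(256)(torch.randn(1, 256, 40, 40))` returns a reweighted feature map of the same shape.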

3.3.2. Replacement of the Default Loss Function

The YOLO v7 series follows the CIoU loss function of YOLO v5, which was derived by improving on DIoU. Although CIoU introduces a penalty term into the IoU loss to alleviate the vanishing-gradient problem, it only uses the relative aspect ratio of width and height: once the detection box and the Ground Truth (GT) box have proportional width and height, the penalty term no longer works, and because the gradients of width and height in this term have opposite signs, optimization efficiency is also reduced. The schematic diagram is shown in Figure 11.
In the figure, b and b^gt are the center points of the detection box and the Ground Truth box, respectively, ρ is the distance between these two points, and c is the diagonal length of the smallest enclosing region that contains both the detection box and the GT box. To address the above problems, the Focal-EIoU loss function is introduced, which combines EIoU loss and Focal L1 loss: EIoU loss directly uses the differences of the edge lengths as penalty terms, while Focal L1 loss suppresses low-quality samples by assigning them smaller gradients. Integrating the two yields the final Focal-EIoU loss [14], which can be expressed as Equations (2) and (3).
$$L_{Focal\text{-}EIoU} = IoU^{\gamma} \, L_{EIoU} \tag{2}$$
$$L_{EIoU} = L_{IoU} + L_{dis} + L_{asp} = 1 - IoU + \frac{\rho^2(b, b^{gt})}{c^2} + \frac{\rho^2(\omega, \omega^{gt})}{c_\omega^2} + \frac{\rho^2(h, h^{gt})}{c_h^2} \tag{3}$$
In these formulas, γ is a hyperparameter that controls the shape of the curve and is set to 0.5; IoU is the overlap between the GT and predicted boxes; L_IoU is the IoU loss, L_dis is the distance loss, and L_asp is the edge-length loss; b and b^gt are the center points of the prediction box and the GT box; ω and ω^gt are the widths of the detection box and the GT box; h and h^gt are their heights; ρ(·,·) denotes the Euclidean distance between its two arguments; and c_ω and c_h are the width and height of the smallest enclosing box covering the two boxes.
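A compact PyTorch sketch of Equations (2) and (3) is given below for clarity; the (x1, y1, x2, y2) box format and the small ε stabilizer are assumptions, and this snippet is not the training implementation used in this work.

```python
import torch

def focal_eiou_loss(pred, target, gamma=0.5, eps=1e-7):
    # pred, target: (N, 4) boxes as (x1, y1, x2, y2). Sketch of Eqs. (2)-(3).
    px1, py1, px2, py2 = pred.unbind(-1)
    tx1, ty1, tx2, ty2 = target.unbind(-1)

    # Intersection and union -> IoU
    iw = (torch.min(px2, tx2) - torch.max(px1, tx1)).clamp(min=0)
    ih = (torch.min(py2, ty2) - torch.max(py1, ty1)).clamp(min=0)
    inter = iw * ih
    union = (px2 - px1) * (py2 - py1) + (tx2 - tx1) * (ty2 - ty1) - inter
    iou = inter / (union + eps)

    # Smallest enclosing box (width c_w, height c_h, diagonal^2 = c^2)
    cw = torch.max(px2, tx2) - torch.min(px1, tx1)
    ch = torch.max(py2, ty2) - torch.min(py1, ty1)
    c2 = cw ** 2 + ch ** 2 + eps

    # Center distance and width/height differences (Eq. 3)
    rho2 = ((px1 + px2 - tx1 - tx2) ** 2 + (py1 + py2 - ty1 - ty2) ** 2) / 4
    dw2 = ((px2 - px1) - (tx2 - tx1)) ** 2
    dh2 = ((py2 - py1) - (ty2 - ty1)) ** 2

    l_eiou = (1 - iou) + rho2 / c2 + dw2 / (cw ** 2 + eps) + dh2 / (ch ** 2 + eps)
    return (iou.detach() ** gamma) * l_eiou  # Eq. (2): focal weighting by IoU^gamma
```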
We compared the CIoU, GIoU, and DIoU loss functions supported by YOLO v7 by default with Focal-EIoU (with the CA attention mechanism already added, based on the experimental results in Section 3.3.1); the results are shown in Table 2.

3.4. Training and Comparison of the Improved YOLO v7-Tiny Network

The CA attention mechanism and the Focal-EIoU loss function were added to the YOLO v7-tiny model, and the improved model is named YOLO v7-tiny-CF. The improved model and the original YOLO v7-tiny were each trained for 300 epochs; the training results are shown in Figure 12, and a comparison of the detection results is shown in Figure 13. The experimental environment was the Windows 11 Professional OS with an Intel i5-13500HX CPU, an NVIDIA RTX 4060 Laptop GPU, and 16 GB of RAM. The GPU acceleration library was CUDA 11.3, the runtime environment was PyTorch 1.12.0, and the language used was Python. The experiments were conducted in December 2023 in Taian City, Shandong Province, China. The study mainly used precision (P), recall (R), and mAP@0.5 to evaluate the training results [15]. All models were trained for 300 epochs, and the dataset was divided into training, testing, and validation sets in the ratio of 8:1:1. The batch size was set to 4 during training, and the initial learning rate was set to 0.01, reduced to 0.001 after 100 epochs and to 0.0001 after 200 epochs.
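The step learning-rate schedule described above can be summarized by the following small helper (a sketch of the schedule only, not of the actual training script):

```python
def learning_rate(epoch: int) -> float:
    # Step schedule from the text: 0.01 for the first 100 epochs,
    # 0.001 up to epoch 200, then 0.0001 for the remainder.
    if epoch < 100:
        return 0.01
    if epoch < 200:
        return 0.001
    return 0.0001
```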
The training curves show that both the original model and the improved model achieve very high precision, recall, and mAP after 300 training epochs. However, the YOLO v7-tiny-CF model outperforms the YOLO v7-tiny model in terms of the fluctuation range of the curves during the initial rise as well as the stable values after 200 epochs. The validation objectness loss of the improved model also decreases more steadily than that of the original model and reaches a lower value. Similarly, the P-R curves (e) and F1 curves (f) of the improved model outperform those of the original model. As can be seen from Figure 13a,b, the original model produces false positives and false negatives, whereas the improved model does not exhibit these problems on the same images (Figure 13c,d). Based on the above analysis, the improvement route for the YOLO v7-tiny algorithm is judged to be fundamentally sound. Table 3 compares the evaluation indexes of the improved model, the original model, and other object detection algorithms, including YOLO v5s, YOLO v7, and Faster-RCNN.
As seen in Table 3, the improved YOLO v7-tiny model achieves essentially the same detection performance as the full YOLO v7 with only about 15% of the model size, 17% of the parameters, and 29% of the GPU memory usage, while its detection speed is nearly 64% higher. In addition, the improved model raises precision, recall, and mAP by 0.8%, 0.5%, and 0.7%, respectively, compared with the original YOLO v7-tiny at almost the same computational cost, and YOLO v7-tiny-CF holds a more significant advantage over the YOLO v5s and Faster-RCNN models in terms of both resource consumption and performance.
In training and testing, the theoretical computing power of the NVIDIA RTX 4060 Laptop GPU we used is about 15 TFLOPS, while that of the Jetson Xavier NX platform is about 6 TFLOPS, i.e., roughly 40% of the laptop GPU. In addition, the CMOS camera we used has a maximum frame rate of 17 fps [16,17]. A rough calculation based on theoretical performance shows that, after porting to the Jetson Xavier NX platform, only the YOLO v7-tiny/YOLO v7-tiny-CF (about 23.6 FPS) and YOLO v5s (about 16.8 FPS) models in Table 3 can utilize the CMOS camera's performance and meet the requirement of general pipeline detection (more than 15 FPS) [18]. Therefore, considering both performance and load, we chose the most cost-effective model, YOLO v7-tiny-CF, for porting.
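The rough throughput estimate behind this choice can be reproduced as follows (frame rates taken from Table 3; linear scaling by the compute ratio is a simplifying assumption):

```python
# Scale the laptop-GPU frame rates from Table 3 by the ratio of theoretical
# compute (about 6 TFLOPS / 15 TFLOPS = 0.4) to estimate Jetson Xavier NX
# throughput, then compare with the 15 FPS pipeline requirement.
laptop_fps = {"YOLO v7-tiny / v7-tiny-CF": 59, "YOLO v5s": 42, "YOLO v7": 36}
ratio = 6 / 15
for name, fps in laptop_fps.items():
    est = fps * ratio
    print(f"{name}: ~{est:.1f} FPS on Jetson -> meets 15 FPS requirement: {est >= 15}")
```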
To port the model generated by YOLO training to the Jetson Xavier NX platform, the ".pt" model file needs to be converted to the ONNX format and then to a TensorRT engine in preparation for the subsequent experiments [18].
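A minimal sketch of the first conversion step using torch.onnx.export is shown below; the file names, input size, opset version, and checkpoint layout are assumptions, and the resulting ONNX file would then be converted to a TensorRT engine on the Jetson (for example with NVIDIA's trtexec tool).

```python
import torch

# Assumes the ".pt" checkpoint stores the network under the "model" key,
# as in the public YOLOv7 repository; file names and input size are illustrative.
ckpt = torch.load("yolov7-tiny-cf.pt", map_location="cpu")
model = ckpt["model"].float().eval()

dummy = torch.zeros(1, 3, 640, 640)  # batch, channels, height, width (assumed)
torch.onnx.export(
    model, dummy, "yolov7-tiny-cf.onnx",
    opset_version=12,
    input_names=["images"],
    output_names=["output"],
)
```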

4. Design of Key Components

4.1. Design of the Conveyor Belt and Tipping Mechanism

The whole system has four independent conveyor belts, responsible for front-side detection, reverse-side detection, weighing, and grading; they are arranged in an L-shape and connected by aviation plugs. Each conveyor has the same basic structure and is driven by a stepper motor through a transmission chain that turns the rollers. Figure 14 shows the conveyor belt responsible for front-side inspection and the flip device at its end. Wings that have completed front-side detection are flipped by this device so that the reverse side can be inspected; the flipper is also driven by a stepper motor. The flipper is fixed to a U-shaped bracket by a spindle using aluminum alloy bearing mounts, where the bearings carry the weight of the flipper and the chicken wings and the housing holds the bearings in place; the design adopts a bearing assembly that integrates the bearings and the housing into one unit.
Combining the actual length, diameter, and thickness of the chicken wings, the width, depth, and opening angle of the flipper's V-groove were set to 4.5 cm, 10 cm, and 75°, respectively. In addition, to match the rate of the flipper with that of the preceding fixed-pitch conveyor unit, the angular speed of the flipper was calculated according to the following formula:
$$\begin{cases} T_d = \dfrac{2\pi}{\omega} \\[4pt] T_c = \dfrac{S_c}{V_c} \end{cases}$$
In this formula, T_d is the period of the circular motion of the flipper (s); ω is the angular velocity of the flipper (rad/s); T_c is the movement time of the fixed-pitch conveyor unit (s); S_c is the distance traveled by the fixed-pitch conveyor unit in T_c (m); and V_c is the movement speed of the fixed-pitch conveyor unit (m/s). The distance between two neighboring conveyor units is set at 10 cm, and the angular speed of the flipper is set at 2 rad/s, assuming a conveyor speed of 0.6 m/s. The flipper is therefore designed with six V-grooves, which satisfies the requirement of flipping 3-6 chicken wings per second.

4.2. Design of Weighing Unit

After the unqualified chicken wings are removed, the remaining wings can be graded by weight, so the weighing unit is arranged between the quality inspection and grading units, and a separate conveyor belt is dedicated to the weighing operation. This study uses four PW6D single-point load cells manufactured by HBM (Darmstadt, Germany), with a sensitivity of 2.0 ± 0.2 mV/V, together with an ADS1256 high-precision data acquisition card. The load cell converts the load into a voltage signal, which, after amplification, filtering, and other conditioning, is digitized by the AD converter module and fed to the corresponding pins of the STM32 MCU to complete the acquisition of weight information. In the filtering stage, each time chicken wing data are collected, an amplitude-limiting step is applied first, and the resulting data are then passed through a sliding mean filter. The design of the weighing unit and the weight sensor is shown in Figure 15.
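A simplified sketch of this two-stage conditioning (amplitude limiting followed by a sliding mean) is shown below; the window length and limiting threshold are illustrative assumptions, and the actual firmware implementation on the STM32 will differ.

```python
from collections import deque

class WeightFilter:
    # Amplitude-limiting step followed by a sliding (moving) mean filter,
    # as described in the text; parameter values are assumed for illustration.
    def __init__(self, window: int = 8, max_step: float = 5.0):
        self.samples = deque(maxlen=window)
        self.max_step = max_step  # largest plausible jump between samples (g)
        self.last = None

    def update(self, raw_grams: float) -> float:
        # Amplitude limiting: reject a sample that jumps too far from the last accepted one.
        if self.last is not None and abs(raw_grams - self.last) > self.max_step:
            raw_grams = self.last
        self.last = raw_grams
        # Sliding mean over the most recent samples.
        self.samples.append(raw_grams)
        return sum(self.samples) / len(self.samples)
```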

4.3. Design of Grading Unit

The final actuator of the weight grading unit plays a very important role in the whole system; its task is to grade the chicken wings by weight after quality inspection, turning, and weighing. According to the industry standard, chicken wings are generally divided into L, M, and S grades (corresponding to weights of ≥41 g, 31-40 g, and <30 g, respectively), so three storage boxes are installed on both sides of the conveyor belt [19]. Two photoelectric sensors are set up in front of each group of storage boxes; when a sensor detects a chicken wing of the corresponding grade, the MCU signals the stepper motor to drive the pusher, and the wing is pushed into the corresponding storage box. When the wings in a storage box reach a preset weight, the bottom flap of the box is opened by a motorized actuator for the next operation. The design of the weight grading unit and the storage box is shown in Figure 16.
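The grade-assignment rule itself is straightforward; a sketch is given below (treatment of the 30-31 g and 40-41 g boundary weights is an assumption, since the standard quoted above leaves them unspecified).

```python
def grade_wing(weight_g: float) -> str:
    # L >= 41 g, M 31-40 g, S < 30 g, as cited above; boundary handling is assumed.
    if weight_g >= 41:
        return "L"
    if weight_g >= 31:
        return "M"
    return "S"

# Example: a 36.2 g wing is routed to the M-grade storage boxes.
print(grade_wing(36.2))  # -> "M"
```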

4.4. Design of the Control System

The main control platform of the system is the STM32F407VET6 microcontroller; its minimum system includes the main chip, power supply module, clock circuit, reset circuit, etc. The STM32F407VET6 uses a 32-bit Cortex-M4 core with a floating-point unit, 100 pins, and 192 KB of SRAM; the microcontroller minimum system is shown in Figure 17 [20].
The main control platform is responsible for receiving and processing the signals sent by each information acquisition module and sending commands to the actuator according to the preset program or manual operation; the structure of the control system is shown in Figure 18.
Since the detection, turning, weighing, and sorting links of the device are driven by different stepper motors, acquiring and matching the speed of each unit is the key task of the control system [21]. The system uses an Omron E6B2-CWZ6C incremental rotary encoder for speed acquisition, and the STM32 microcontroller reads the encoder output pulses through a TLP2745 high-speed optocoupler to calculate the real-time speed of each mechanism. The STM32 microcontroller then computes the difference between the target speed and the actual speed as the input of a PID controller, whose output, a PWM wave, drives the stepper motor. The stepper motor drive circuit is shown in Figure 19.
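The speed loop described above can be sketched as a standard positional PID whose output is a PWM duty cycle; the gains, output limits, and update interval are illustrative assumptions, not the tuned values used on the device.

```python
class SpeedPID:
    # Positional PID sketch: error between target and encoder-measured speed
    # drives a PWM duty cycle that is clamped to a valid range.
    def __init__(self, kp: float, ki: float, kd: float,
                 out_min: float = 0.0, out_max: float = 100.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_min, self.out_max = out_min, out_max
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, target_speed: float, measured_speed: float, dt: float) -> float:
        err = target_speed - measured_speed
        self.integral += err * dt
        deriv = (err - self.prev_err) / dt if dt > 0 else 0.0
        self.prev_err = err
        out = self.kp * err + self.ki * self.integral + self.kd * deriv
        return max(self.out_min, min(self.out_max, out))  # clamp to PWM duty-cycle range

# Example: update the loop every 10 ms with assumed gains.
pid = SpeedPID(kp=2.0, ki=0.5, kd=0.05)
duty = pid.update(target_speed=0.6, measured_speed=0.55, dt=0.01)
```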

4.5. Design of UI

In this study, the MCGS TPX7062Ti touch screen is used as the main human-computer interaction device. It communicates with the STM32 microcontroller through RS485-TTL signals, and the data follow the Modbus RTU protocol. The MCGS touch screen provides the system's start/stop control, quantity display, fault indication, status monitoring, and related functions; its working interface is shown in Figure 20.
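As an illustration of the Modbus RTU framing used on this link, the sketch below computes the CRC-16/MODBUS checksum that terminates every RTU frame; the register addresses actually exchanged with the touch screen are project-specific and not shown here.

```python
def modbus_crc16(frame: bytes) -> bytes:
    # CRC-16/MODBUS: polynomial 0xA001 (reflected 0x8005), initial value 0xFFFF,
    # appended to the frame least-significant byte first.
    crc = 0xFFFF
    for byte in frame:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ 0xA001 if crc & 1 else crc >> 1
    return crc.to_bytes(2, "little")

# Example: a "read holding registers" request (slave 1, function 0x03,
# start address 0x0000, quantity 2) with its CRC appended.
request = bytes([0x01, 0x03, 0x00, 0x00, 0x00, 0x02])
print((request + modbus_crc16(request)).hex(" "))
```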

5. Experimental Results and Analysis

After the preliminary design, analysis, and testing, and in line with the actual requirements and the technical route, the group manufactured the third-generation prototype of the deep learning-based chicken wing quality inspection and weight grading equipment, as shown in Figure 21.

5.1. Chicken Wing Quality Inspection Experiment

After deploying the YOLO v7-tiny-CF model described in Section 3, the testbed was built as shown in Figure 22. In this experiment, for the convenience of data acquisition, the same CMOS camera was used to capture images of both the front and back sides of the chicken wings. A total of 998 white-feathered chicken wings were selected as the research objects, including 162 unqualified wings with bruises, mutilations, feather residue, and similar defects; they were mixed and tested in groups of 200 wings each (the last group contained 198), and the experimental results are shown in Table 4.
In this table, TP represents qualified wings detected as qualified; TN represents unqualified wings detected as unqualified; FP represents unqualified wings detected as qualified; and FN represents qualified wings detected as unqualified. The average accuracy of the five groups of experiments is 97.0%. The results show that the nondestructive quality inspection model based on YOLO v7-tiny-CF solves the problem of nondestructive inspection of chicken wings well and can meet actual production needs.
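The precision, recall, and F1 score in Table 4 follow the standard definitions; the small helper below reproduces them from the TP/FP/FN counts (Group No.1 is used as an example).

```python
def detection_metrics(tp: int, fp: int, fn: int) -> tuple[float, float, float]:
    # Standard definitions: precision = TP/(TP+FP), recall = TP/(TP+FN),
    # F1 = harmonic mean of precision and recall.
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Group No.1 from Table 4: TP = 152, FP = 4, FN = 2.
p, r, f1 = detection_metrics(152, 4, 2)
print(f"P={p:.1%}, R={r:.1%}, F1={100 * f1:.1f}")  # approximately the Group No.1 values
```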

5.2. Chicken Wing Grading Experiment

After rejecting the unqualified chicken wings, the remaining 836 qualified wings were used for the weight-grading experiments. Among them, 196 wings were L-grade (≥41 g), 453 were M-grade (31-40 g), and 187 were S-grade (<30 g). In the experiment, only chicken wings of the same weight class were tested at a time, each test was repeated three times, the average accuracy was taken, and the speed was set at four wings per second. The design of the experimental process is shown in Figure 23, and the results are shown in Table 5 [22].
As the data in the table show, the overall accuracy of the device in grading the different grades of chicken wings is above 98%. During the experiment, the most common cause of grading failure was that the speed of the pusher did not accurately match the speed of the conveyor belt, which accounted for 59% of the failure cases; another 33% were due to weighing errors outside the allowable range; and the remaining 8% were caused by the pusher jamming on the guide rail, the photoelectric sensors failing to register a signal, and similar issues. Overall, the success rate of chicken wing grading is high, and the efficiency is significantly higher than that of manual weighing (2000 to 2200 wings/h), which indicates that the technical route and design ideas are fundamentally correct and feasible. However, there is still room for optimization in the control algorithms, response speed, and other aspects [23].

6. Conclusions

  • This study developed a third-generation prototype of a chicken wing quality detection and weight grading device based on the group's first two generations of products. Combining previous experimental experience with market demand, the structural layout, detection algorithm, flipper device, weighing unit, grading unit, and other components were redesigned.
  • The improved quality inspection model based on YOLO v7-tiny was deployed on the Jetson Xavier NX and achieved an accuracy of no less than 96% in the experiments, successfully rejecting most substandard products. In the weight-grading experiments, the comprehensive accuracy of the device was above 98%, and its operational efficiency was much higher than that of manual sorting.
  • The experimental results show that the development of the device is generally successful, although there is still room for improvement in the algorithms, control, and mechanical reliability. This experience lays a foundation for future iterations of the equipment and algorithms.

Author Contributions

Conceptualization, K.W. and Z.L. (Zhiyong Li); Software, C.W. and B.G.; Methodology, J.L.; Writing—review and editing, X.D., K.W. and Z.L. (Zhengchao Lv). All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Shandong Province Key R&D Project (No. 2015GGH311001), via the Shandong Provincial Department of Education.

Institutional Review Board Statement

Ethical approval was not applicable for this study. In our study, we designed a non-destructive testing and weight grading device for chicken wings. The purpose of our experiments was to improve food processing efficiency and quality control, not to study animal behaviour or physiology. Our experiments did not involve the direct use of any animals or additional harm to any animals.

Data Availability Statement

The data presented in this study are available in this article.

Acknowledgments

The authors would like to extend their appreciation to Shandong Agricultural University, China, for supporting this project.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Li, J.; Xie, B.; Zhai, Z.; Zhang, P.; Hou, S.T. Research progress of intelligent equipment and technology for livestock and poultry slaughtering and processing. Food Mach. 2021, 37, 226–232. [Google Scholar]
  2. Gao, G. Development prospects of China’s meat industry in 2022. Meat Ind. 2022, 2, 1–5, (In Chinese with English Abstract). [Google Scholar]
  3. Marel. Product-IRIS NT. Available online: https://marel.com/en/products/iris-nt/2020-2-21 (accessed on 22 January 2024).
  4. Amraei, S.; Mehdizadeh, S.A.; Sallary, S. Application of computer vision and support vector regression for weight prediction of live broiler chicken. Eng. Agric. Environ. Food 2017, 10, 266–271. [Google Scholar] [CrossRef]
  5. Zhao, L.; Xu, J.; Wang, C.; Ding, X.; Li, F.; Hou, F. Research and Design of an Automatic Grading Device in Chicken Wing Weight. Wirel. Pers. Commun. 2018, 102, 769–782. [Google Scholar] [CrossRef]
  6. Lv, Z. Research on Quality Detection and Weight Grading of Chicken Wings Based on Deep Learning; Shandong Agricultural University: Taian, China, 2022. [Google Scholar]
  7. Xu, J. Design of Intelligent Equipment for Quality Detection and Weight Grading of Chicken Wings; Shandong Agricultural University: Taian, China, 2016. [Google Scholar]
  8. Nvidia. Jetson Xavier NX Series System-on-Module Data Sheet. Available online: https://www.nvidia.com/en-us/autonomous-machines/embedded-systems/jetson-xavier-series2024-01-10 (accessed on 24 January 2024).
  9. Hikvision. Hikvision Machine Vision Products Catalog MV-CA060-10GC: acA3088-16gc. Available online: https://www.hikvisionweb.com/product/camera/gige/mv-ca060-10gc (accessed on 24 January 2024).
  10. GB/T 19676-2022; Livestock and Poultry Meat Quality Grading-Chicken Meat. Ministry of Agriculture and Rural Affairs of China: Beijing, China, 2022.
  11. Cheng, P.; Tang, X.; Liang, W.; Li, Y.; Cong, W.; Zang, C. Tiny-YOLOv7: Tiny Object Detection Model for Drone Imagery. In Proceedings of the International Conference on Image and Graphics, Nanjing, China, 22–24 September 2023; Springer Nature: Cham, Switzerland, 2023; pp. 53–65. [Google Scholar]
  12. Wang, Z.; Zhang, G.; Luan, K.; Yi, C.; Li, M. Image-Fused-Guided Underwater Object Detection Model Based on Improved YOLOv7. Electronics 2023, 12, 4064. [Google Scholar] [CrossRef]
  13. Wang, J. Research Design of Nondestructive Testing and Weight Grading Device for Chicken Wings Based on Improved YOLOv5s; Shandong Agricultural University: Taian, China, 2023. [Google Scholar]
  14. Zhang, Y.F.; Ren, W.; Zhang, Z.; Jia, Z.; Wang, L.; Tan, T. Focal and efficient IOU loss for accurate bounding box regression. Neurocomputing 2022, 506, 146–157. [Google Scholar] [CrossRef]
  15. Zhou, S.; Cai, K.; Feng, Y.; Tang, X.; Pang, H.; He, J.; Shi, X. An Accurate Detection Model of Takifugu rubripes Using an Improved YOLO-V7 Network. J. Mar. Sci. Eng. 2023, 11, 1051. [Google Scholar] [CrossRef]
  16. Choe, C.; Choe, M.; Jung, S. Run Your 3D Object Detector on NVIDIA Jetson Platforms: A Benchmark Analysis. Sensors 2023, 23, 4005. [Google Scholar] [CrossRef] [PubMed]
  17. Huang, H. Research and Application of YOLO-Based PCB Surface Defect Detection Algorithm; Chongqing University of Technology: Chongqing, China, 2023. [Google Scholar]
  18. Jeon, J.; Jung, S.; Lee, E.; Choi, D.; Myung, H. Run your visual-inertial odometry on NVIDIA Jetson: Benchmark tests on a micro aerial vehicle. IEEE Robot. Autom. Lett. 2021, 6, 5332–5339. [Google Scholar] [CrossRef]
  19. Jiang, H.; Chen, W.; Jia, Z.; Tao, F. Physiochemical properties of short-term frying oil for chicken wing and its oxidative stability in an oil-in-water emulsion. Food Sci. Nutr. 2020, 8, 668–674. [Google Scholar] [CrossRef] [PubMed]
  20. Li, T.; Luan, F.; Wang, M.; Song, Q.; Shi, Z. Design of Remote Monitoring System Based on STM32F407 Microcontroller. In Proceedings of the 2019 IEEE International Conference on Power, Intelligent Computing and Systems (ICPICS), Shenyang, China, 12–14 July 2019; pp. 304–307. [Google Scholar]
  21. Baigvand, M.; Banakar, A.; Minaei, S.; Khodaei, J.; Behroozi-Khazaei, N. Machine vision system for grading of dried figs. Comput. Electron. Agric. 2015, 119, 158–165. [Google Scholar] [CrossRef]
  22. Zhang, Z.S.; Cui, J.; Yang, Q.; Ren, S.M.; Sun, B. The Roller Speed Control of High Voltage Electrostatic Separator Based on PLC. AMR 2014, 1037, 240–243. [Google Scholar] [CrossRef]
  23. Zhao, Y. Research on Quality Detection and Weight Grading of Chicken Wings Based on Machine Vision; Shandong Agricultural University: Taian, China, 2020. [Google Scholar]
Figure 1. (a) The first-generation prototype diagram; (b) the second-generation prototype diagram.
Figure 2. The 3D graph of the machine. ① MCGS screen (back); ② CMOS camera 1; ③ CMOS camera 2; ④ unqualified product storage box; ⑤ weight sensor; ⑥ photoelectric sensors; ⑦ sliding railway; ⑧ storage box; ⑨ pusher bar; ⑩ stepper motor; ⑪ turning device; ⑫ control cabinet.
Figure 3. Equipment workflow chart.
Figure 4. Sample of the dataset.
Figure 5. Sample of the dataset enhancement effect.
Figure 6. YOLO v7-tiny network structure.
Figure 7. Position of the attention module in the network.
Figure 8. The CA attention mechanism module diagram.
Figure 9. The SENet structure diagram.
Figure 10. The CBAM structure diagram.
Figure 11. The CIoU structure diagram. The green box is the GT box and the blue box is the prediction box.
Figure 12. Comparison of training result performance between the improved model and the original model.
Figure 13. Comparison of the detection performance between the improved model and the original model.
Figure 14. Design of the flip device and conveyor. ① encoder; ② coupling; ③ U-bracket; ④ flip device; ⑤ stepper motor; ⑥ foundation; ⑦ drive chain; ⑧ axis of rotation; ⑨ conveyor. The red line is the outline of the V-shaped groove.
Figure 15. Design of the weighing unit and weight sensor. ① photoelectric sensor; ② stepper motor; ③ weight sensor; ④ data acquisition unit; ⑤ frame; ⑥ conveyor.
Figure 16. Design of the grading unit and storage box. ① conveyor; ② push bar; ③ photoelectric sensor; ④ sliding railway; ⑤ stepper motor 1; ⑥ conveyor; ⑦ stepper motor 2.
Figure 17. The STM32F407VET6 minimum system.
Figure 18. Structure of the control system.
Figure 19. Stepper motor driving circuit.
Figure 20. The UI of MCGS.
Figure 21. Prototype of the third-generation chicken wing quality detection and grading device.
Figure 22. The Jetson Xavier NX experimental platform.
Figure 23. Flow diagram of the grading experiment.
Table 1. Comparison of attention mechanism evaluation indicators.

Model | Precision | Recall | mAP
YOLO v7-tiny | 98.6% | 98.9% | 98.5%
YOLO v7-tiny-CA | 98.9% | 99.1% | 98.8%
YOLO v7-tiny-SE | 98.2% | 98.8% | 98.2%
YOLO v7-tiny-CBAM | 98.9% | 99.0% | 98.5%
Table 2. Comparison of loss function evaluation indicators.

Model | Precision | Recall | mAP
YOLO v7-tiny-CIoU | 98.9% | 99.1% | 98.8%
YOLO v7-tiny-GIoU | 97.3% | 98.1% | 97.2%
YOLO v7-tiny-DIoU | 98.2% | 98.8% | 98.2%
YOLO v7-tiny-Focal-EIoU | 99.4% | 99.4% | 99.2%
Table 3. Comparison of evaluation indexes of each model.

Model | P | R | mAP | F1 Score | Parameter Count | GPU-MEM (GB) | GFLOPs | Size (MB) | Speed (FPS)
Faster-RCNN | 92.5% | 91.4% | 91.9% | 91.9 | 4.16 × 10^7 | 5.21 | 248.4 | 125.1 | 30
YOLO v7 | 99.6% | 99.5% | 99.2% | 99.5 | 3.71 × 10^7 | 4.69 | 109.5 | 71.3 | 36
YOLO v5s | 95.3% | 96.8% | 96.6% | 96.0 | 7.05 × 10^6 | 2.49 | 16.3 | 14.3 | 42
YOLO v7-tiny | 98.6% | 98.9% | 98.5% | 98.7 | 5.47 × 10^6 | 1.38 | 11.9 | 12.3 | 59
YOLO v7-tiny-CF | 99.4% | 99.4% | 99.2% | 99.4 | 5.48 × 10^6 | 1.38 | 11.9 | 11.2 | 59
Table 4. Test results of chicken wing quality.

Group | Qualified | Unqualified | TP | TN | FP | FN | Precision | Recall | F1 Score
No.1 | 154 | 46 | 152 | 42 | 4 | 2 | 97.4% | 98.7% | 98.0
No.2 | 173 | 27 | 169 | 26 | 1 | 6 | 99.4% | 96.6% | 98.0
No.3 | 166 | 34 | 164 | 33 | 3 | 2 | 98.2% | 98.9% | 98.5
No.4 | 178 | 22 | 174 | 22 | 0 | 4 | 100% | 97.8% | 98.9
No.5 | 165 | 33 | 160 | 27 | 3 | 5 | 98.1% | 97.0% | 97.5

Note: Among all 998 chicken wings, there were 4 cases in which two different detection frames appeared at the same time (similar to Figure 13a; these occurred in Groups No.2 and No.3) and 3 cases of missed detections (which occurred in Group No.5).
Table 5. Grading experiment results.

Grade | Correctly Graded Quantity (1st) | Correctly Graded Quantity (2nd) | Correctly Graded Quantity (3rd) | Accuracy
L | 193 | 193 | 194 | 98.6%
M | 451 | 449 | 448 | 99.0%
S | 182 | 186 | 185 | 98.6%
