Article
Peer-Review Record

A Point Cloud Segmentation Method for Pigs from Complex Point Cloud Environments Based on the Improved PointNet++

Agriculture 2024, 14(5), 720; https://doi.org/10.3390/agriculture14050720
by Kaixuan Chang 1, Weihong Ma 2,3,4,*, Xingmei Xu 1, Xiangyu Qi 3, Xianglong Xue 2,3,4, Zhankang Xu 2,5, Mingyu Li 2,3,4, Yuhang Guo 2,3,4, Rui Meng 2,3,4 and Qifeng Li 1,2,3,4,*
Reviewer 1: Anonymous
Reviewer 2: Anonymous
Reviewer 3: Anonymous
Reviewer 4:
Submission received: 27 February 2024 / Revised: 6 April 2024 / Accepted: 29 April 2024 / Published: 2 May 2024
(This article belongs to the Special Issue Application of Sensor Technologies in Livestock Farming)

Round 1

Reviewer 1 Report

Comments and Suggestions for Authors

This study addresses the significant challenges of live pig point cloud segmentation in complex farm environments, which include dynamic animal behaviors and environmental changes such as concealment by objects and posture changes. The specific difficulties mentioned include situations where pigs lick railings and defecate within the acquisition environment.

The study presents an innovative approach and an effective solution to the challenges of point cloud segmentation in complex agricultural environments. However, it would be beneficial to explore the scalability of the improved model and its applicability in other livestock farming contexts and in the segmentation of different species. Furthermore, the integration of additional deep learning techniques to further improve the accuracy and efficiency of the model could be an interesting area for future research. Evaluation of the model's robustness to extreme variations in animal behavior and environmental conditions would also be crucial for its practical implementation.

Author Response

Please see the attachment.

Author Response File: Author Response.pdf

Reviewer 2 Report

Comments and Suggestions for Authors

1. Why were only 529 samples taken?

2. Has the proposed method been implemented? How efficient is this work in real-time implementation?

3. The quality of Figure 1 has to be improved, particularly below the block diagram.

4. PLC is not defined in the Introduction.

5. Highlight the novelty and contribution.

6. In Figure 7, the input and output are not mentioned.

7. Justify the need for deep learning for a dataset of only 529 samples.

8. From where are the equations referred? Cite the relevant sources.

9. Comparisons of the proposed method with existing methods are missing; the work is only compared with PointNet++.

10. The performance indicators in Figure 10 show only two methods. More existing methods could be compared.

11. How are the outputs in Table 3 validated?

12. What are the space and time complexities of the proposed method?

13. Limitations and scope are not mentioned.
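For context on the reviewer's complexity question (point 12): the runtime of a PointNet++-style pipeline is typically dominated by the farthest point sampling (FPS) step in each set abstraction layer, which is O(N·M) for N input points and M samples. The following is a minimal pure-Python sketch of FPS for illustration only, not the authors' implementation:

```python
def farthest_point_sampling(points, m):
    """Greedily pick m point indices, each maximizing its distance
    to the set already chosen -- the O(N*M) subsampling step used in
    PointNet++ set abstraction."""
    n = len(points)
    chosen = [0]                      # deterministic start for reproducibility
    dist = [float("inf")] * n         # distance of each point to the chosen set
    for _ in range(m - 1):
        last = points[chosen[-1]]
        for i, p in enumerate(points):
            d = sum((a - b) ** 2 for a, b in zip(p, last))
            dist[i] = min(dist[i], d)
        chosen.append(max(range(n), key=lambda i: dist[i]))
    return chosen

# Toy cloud: the farthest point from (0,0,0) is (5,5,5).
pts = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (5, 5, 5)]
print(farthest_point_sampling(pts, 2))  # → [0, 3]
```

Quantifying complexity as the reviewer requests would mean reporting this N·M cost per layer together with measured inference time and parameter count.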

Comments on the Quality of English Language

Minor editing

Author Response

Please see the attachment.

Author Response File: Author Response.pdf

Reviewer 3 Report

Comments and Suggestions for Authors

I tried to find how the raw point cloud data are preprocessed before applying the segmentation method, especially in the context of pig environments, but I could not.

The motivation in the introduction is very weak.

No improvement is evident in the context of the improved PointNet++ algorithm; what modifications or enhancements have been made to better handle the complexities of pig environments?

The authors need to specify in Figure 1, as well as in Section 3 (Results and Discussion), which specific techniques are used to optimize the computational efficiency of the segmentation method, particularly considering the large volumes of point cloud data.

The references are very limited.

Comments on the Quality of English Language

Focus more on the quality of the English language.

Author Response

Please see the attachment.

Author Response File: Author Response.pdf

Reviewer 4 Report

Comments and Suggestions for Authors

1. Provide more detailed explanations of the adjustments made to the traditional PointNet++ model parameters and the rationale behind these adjustments. This would help readers understand the specific enhancements made to make the model more suitable for pig segmentation.

2. Provide a brief explanation of why the improved PointNet++ network outperforms the base network. This could include discussing the specific enhancements made to the model architecture or training process.

3. Expand the discussion to analyze the implications of the observed differences in mIoU and accuracy between the two networks. Discuss how these findings contribute to the understanding of point cloud segmentation for live pigs in farming environments.

4. Acknowledge the remaining challenges, such as the insufficient segmentation of parts of the pig body protruding from the pen, and discuss potential strategies for addressing these challenges in future research.
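For reference, the mIoU metric whose differences point 3 asks the authors to analyze is the intersection-over-union computed per class and averaged over classes. A minimal sketch with hypothetical toy labels (an illustration, not the authors' evaluation code):

```python
def miou(pred, target, num_classes):
    """Mean intersection-over-union across classes present in the union."""
    ious = []
    for c in range(num_classes):
        inter = sum(1 for p, t in zip(pred, target) if p == c and t == c)
        union = sum(1 for p, t in zip(pred, target) if p == c or t == c)
        if union > 0:
            ious.append(inter / union)
    return sum(ious) / len(ious) if ious else 0.0

# Hypothetical per-point labels: class 0 = background, class 1 = pig.
pred   = [1, 1, 0, 0, 1]
target = [1, 0, 0, 0, 1]
print(miou(pred, target, 2))  # each class has IoU 2/3, so mIoU = 2/3
```

Because mIoU penalizes every false positive and false negative per class, it can diverge noticeably from overall point accuracy on imbalanced clouds, which is why reporting both, as the paper does, is informative.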

Comments on the Quality of English Language

Minor editing of English language required

Author Response

Please see the attachment.

Author Response File: Author Response.pdf

Round 2

Reviewer 2 Report

Comments and Suggestions for Authors

7. Justify the need for deep learning for a dataset of only 529 samples. The authors have justified the data.

8. The equations have to be cited with references.

11. How are the outputs in Table 3 validated? The authors still have not clarified this. The authors point to Table 3 and Figure 12, which do not discuss the same parameter.

12. What are the space and time complexities of the proposed method? The authors' reply is not satisfactory. Quantify the complexity.

Comments on the Quality of English Language

Minor

Author Response

Please see the attachment.

Author Response File: Author Response.pdf
