Peer-Review Record

Validation Scores to Evaluate the Detection Capability of Sensor Systems Used for Autonomous Machines in Outdoor Environments

Electronics 2024, 13(12), 2396; https://doi.org/10.3390/electronics13122396
by Magnus Komesker 1,2,*, Christian Meltebrink 2, Stefan Ebenhöch 2, Yannick Zahner 2, Mirko Vlasic 2 and Stefan Stiene 1
Reviewer 1: Anonymous
Reviewer 2:
Reviewer 3: Anonymous
Submission received: 15 May 2024 / Revised: 8 June 2024 / Accepted: 17 June 2024 / Published: 19 June 2024
(This article belongs to the Special Issue Intelligent Sensor Systems Applied in Smart Agriculture)

Round 1

Reviewer 1 Report

Comments and Suggestions for Authors

This paper introduces new validation scores that evaluate the detection capability of optical sensors intended to be applied to outdoor machines.
This was achieved by developing an extension to the new Real Environment Detection Area (REDA) method that was previously published by two authors of this submission. There are some aspects of the proposition that are unclear to me and require further explanation.

1) "Thus, the exemplary real-world situation can be described with the matrices G_{p},E_{p}" - what are those matrices? In Figure 9 there are some examples of detection goals, environment noise, and sensor configuration, but it is unclear to me how to transform them into matrices. Are those "matrices" matrices in the mathematical sense, or merely lists of lists of parameters and metadata?

2) Figures 11 and 14 - the point cloud / REDA point cloud is rather unrealistic. With nearly every type of distance sensor (ultrasound, laser, visual), if there is an obstacle, you cannot determine what is behind it (whether the region is empty or not):

11111
10xxx
11111

(x is an invisible region, 0 is an obstacle). How do you deal with this situation? How does this affect the calculation of (3)-(5)?
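The shadowing effect described above can be sketched programmatically. The following is a minimal illustration of line-of-sight occlusion on a row grid with a sensor assumed to look along each row from the left; it is not the authors' REDA computation, and the function name is purely illustrative:

```python
# Mark every cell behind the first obstacle in each row as invisible ('x'),
# for a hypothetical sensor scanning each row from the left.
def shadow_rows(grid):
    out = []
    for row in grid:
        blocked = False
        new_row = []
        for cell in row:
            if blocked:
                new_row.append('x')   # behind an obstacle: state is unknown
            else:
                new_row.append(cell)
                if cell == '0':       # an obstacle blocks the line of sight
                    blocked = True
        out.append(''.join(new_row))
    return out

print(shadow_rows(["11111", "10111", "11111"]))
# → ['11111', '10xxx', '11111']
```

Cells marked 'x' can be counted as neither TP, FP, FN, nor TN, which is exactly why their treatment in (3)-(5) matters.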

3) Figure 13 - how do you determine the granularity (sampling size) of the matrix? Do you take into account the deterioration of detection over distance? Do you take into account the speed of the machine?

4) Equations (1), (2) - multiplication with the "×" symbol is often reserved for the cross (vector) product. Please change it to a dot.

5) You use the standard TP, FP, FN, and TN notation for detection evaluation.
- why not call (3) accuracy?
- why not call (4) recall?
- why not call (5) negative predictive value?

6) Are (3)-(5) metrics or only scores?

7) Why not use precision, recall, and F1 score instead of (3)-(5)?
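For reference, the standard confusion-matrix quantities named in comments 5) and 7) can be written down as follows. These are the textbook definitions, not the paper's equations (3)-(5), which the reviewer is asking the authors to relate to them:

```python
# Textbook confusion-matrix metrics (illustrative; not the paper's scores).
def accuracy(tp, fp, fn, tn):
    return (tp + tn) / (tp + fp + fn + tn)

def recall(tp, fn):
    return tp / (tp + fn)          # a.k.a. sensitivity / true positive rate

def npv(tn, fn):
    return tn / (tn + fn)          # negative predictive value

def precision(tp, fp):
    return tp / (tp + fp)

def f1(tp, fp, fn):
    p, r = precision(tp, fp), recall(tp, fn)
    return 2 * p * r / (p + r)     # harmonic mean of precision and recall

print(accuracy(40, 5, 10, 45))  # → 0.85
```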

Author Response

Please see the attachment

Author Response File: Author Response.pdf

Reviewer 2 Report

Comments and Suggestions for Authors

The paper presents three validation scores (usability, availability, and reliability) to quantify the detection capability of computer vision systems mounted on autonomous machines in outdoor environments, in particular in the presence of harsh environmental conditions. The article continues the line of research presented in a previous work by the same authors (Ref. [5]). I think the paper must undergo major revisions, since in its present form the discussion is not easy to follow. These are the major weaknesses that must be addressed by the authors.

1) The paper discusses the problem of quantifying the detection capability of computer vision systems in a very abstract way, and it is not easy for the non-specialist to follow the discussion. I think the authors should present a real application case and clearly show how the proposed formalism applies to the example case study.

2) Line 241. I do not understand the sentence “That fits with the definition of the SRS description in 4 shown in chapter 2.1”.

3) The presentation of the REDA and REDAM methods in Section 2.3.2 is not clear, and it is not easy to understand what Fig. 6 and Fig. 7 refer to. Again, showing how the general REDA and REDAM methods apply to a real case study would help make the discussion clearer.

4) In Section 5 the proposed metrics are presented. However, it is not clear how these metrics can improve the detection capability of computer vision systems. What metrics are currently used? How do the proposed metrics improve detection capability? Please discuss how the proposed metrics help to improve the current state of the art for the detection capability of computer vision systems.

Author Response

Please see the attachment

Author Response File: Author Response.pdf

Reviewer 3 Report

Comments and Suggestions for Authors

The authors present a paper that aims to introduce novel validation scores for the detection capability of optical sensors used in autonomous machines operating outdoors. They propose a link between the Real Environment Detection Area method and the sensor standard IEC 62998. The resulting strategy includes a static analysis based on usability, availability, and reliability scores. The authors report an application of the method to the agricultural sector.

 

The manuscript, in my opinion, is well organized; in particular, the introduction presents the problem in real applications well, with a description of all the limitations of the current methods. However, in my opinion the paper needs some minor revisions, as reported below:

- in Section 2.1.2 (Sensor-related standards), the authors should add some comments about how possible environmental interferents (for example, dust) are taken into account in the model in Figure 4. This information could improve the reader's comprehension of the text.

- the resolution and axis text of Figures 6 and 7 should be increased.

- The authors should add some comments in Section 4, where the response time is discussed, in particular on any limitations from a timing point of view (e.g., sensor response times and the response time of the whole processing chain).

Author Response

Please see the attachment

Author Response File: Author Response.pdf

Round 2

Reviewer 1 Report

Comments and Suggestions for Authors

The authors have addressed all my remarks. In my opinion, the paper can be accepted as it is.

Reviewer 2 Report

Comments and Suggestions for Authors

The Authors addressed my comments. The paper can be accepted for publication.
