Regularized Chained Deep Neural Network Classifier for Multiple Annotators
Round 1
Reviewer 1 Report
The paper proposes a deep network architecture for the task of classification with multiple annotators.
The paper is well written and organized, the experimentation is thorough, and the results are compelling.
However, the evaluation is not completely sound: results from multiple datasets are averaged, which is meaningless unless the datasets are related. This is not good practice for comparing methods.
I refer you to the classical paper by Demsar, "Statistical Comparisons of Classifiers over Multiple Data Sets", Journal of Machine Learning Research 7 (2006), 1–30, which recommends the Friedman test over the ranks of the algorithms as a sounder methodology.
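The procedure Demsar recommends, ranking the competing classifiers within each dataset and testing whether the average ranks differ, is available in SciPy. A minimal sketch follows; the method names and accuracy values are invented placeholders, not results from the paper:

```python
# Friedman test over per-dataset scores of several classifiers,
# following Demsar (2006). All scores below are invented placeholders.
from scipy.stats import friedmanchisquare

# Accuracies of three hypothetical methods on the same five datasets
# (entries at the same index refer to the same dataset).
method_a = [0.81, 0.74, 0.90, 0.68, 0.77]
method_b = [0.79, 0.72, 0.88, 0.66, 0.75]
method_c = [0.85, 0.78, 0.91, 0.70, 0.80]

# The test ranks the methods within each dataset and checks whether
# the mean ranks differ significantly across methods.
stat, p_value = friedmanchisquare(method_a, method_b, method_c)
print(f"Friedman statistic = {stat:.3f}, p-value = {p_value:.4f}")
```

A significant result would then typically be followed by a post-hoc test (e.g. Nemenyi) to identify which pairs of methods differ.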
There are a couple of typos in the text:
Line 34: "schem" should be "schema"
Line 294: "probes" should be "proves"
Author Response
Please see the attachment
Author Response File: Author Response.pdf
Reviewer 2 Report
The idea presented in the paper is quite interesting and useful.
However, there are some points that require some more work, in my opinion:
- The Introduction section is too long; it would be better to split it into an overview and a state-of-the-art section
- The definition of the method is easy to follow, but it lacks some insights:
- Could other activation layers be used?
- Does the backpropagation remain the same?
- Could the method be adapted to other types of layers (convolutional, recurrent, etc.)?
- The results and discussion sections are well designed, but I would like to see further validation:
- The simulation of multiple annotators on the synthetic datasets is quite simple
- How would the method behave with a malicious annotator? How many malicious annotators are required to degrade performance?
- Other configurations of annotators would help to establish the resilience of the method
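The malicious-annotator experiment suggested above could be simulated along these lines. This is only a sketch under assumed conventions (the dataset size, class count, reliability values, and flipping rule are all invented for illustration, not the paper's actual setup):

```python
# Sketch: simulate annotators of varying reliability, including a
# malicious one who systematically reports a wrong label. All names
# and parameters here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_classes = 1000, 3
true_labels = rng.integers(0, n_classes, size=n_samples)

def annotate(true_labels, reliability, malicious=False, rng=rng):
    """Return noisy labels: correct with probability `reliability`,
    otherwise a uniformly random wrong class. A malicious annotator
    always reports a wrong class."""
    labels = true_labels.copy()
    if malicious:
        # Adversarial: shift every label to a different (wrong) class.
        return (labels + 1) % n_classes
    flip = rng.random(len(labels)) > reliability
    shift = rng.integers(1, n_classes, size=len(labels))  # never 0
    labels[flip] = (labels[flip] + shift[flip]) % n_classes
    return labels

# Example configuration: two imperfect annotators and one malicious one.
annotations = [
    annotate(true_labels, reliability=0.9),
    annotate(true_labels, reliability=0.7),
    annotate(true_labels, reliability=0.0, malicious=True),
]
for i, a in enumerate(annotations):
    acc = (a == true_labels).mean()
    print(f"annotator {i}: agreement with ground truth = {acc:.2f}")
```

Sweeping the number of malicious annotators in such a setup would give the resilience curve the reviewer asks for.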
- Finally, GPC-GOLD appears to outperform your proposed solution by a large margin; is there any improvement to be made to overcome this?
Author Response
Please see the attachment
Author Response File: Author Response.pdf
Round 2
Reviewer 2 Report
The authors have addressed my previous concerns.