Peer-Review Record

A Gaussian Process Decoder with Spectral Mixtures and a Locally Estimated Manifold for Data Visualization

Appl. Sci. 2023, 13(14), 8018; https://doi.org/10.3390/app13148018
by Koshi Watanabe 1, Keisuke Maeda 2, Takahiro Ogawa 2 and Miki Haseyama 2,*
Submission received: 12 April 2023 / Revised: 1 July 2023 / Accepted: 5 July 2023 / Published: 9 July 2023

Round 1

Reviewer 1 Report

 

This manuscript introduces a novel visualization-aided dimensionality reduction method based on Gaussian process latent variable models (GP-LVMs). The proposed method combines the advantages of the spectral mixture (SM) kernel and a locally estimated manifold. By applying the expressive SM kernel function and the locally estimated data manifold in GP-LVMs, the proposed method can preserve the local structure of the data; it can also preserve the global structure of the data owing to the Gaussian process-based formulation. In order to reduce the computational complexity when handling datasets with thousands of samples, the authors introduce a sparse method and realize scalable optimization with a lower bound of the log-posterior distribution. In the experiments, the manuscript compares the proposed method with multiple methods, and the proposed method shows promising results in both global and local quality metrics. The numerical studies of this work are solid and thorough and clearly show the efficiency of the proposed method.

 

The proposed method is novel and of significance. Therefore, I suggest that this manuscript be accepted after a few revisions.

  

Below equation (3) on page 4, “d-th row” should be “d-th column”.

Below equation (11) on page 6, “a nearest” should be “the nearest”.

Below equation (16) on page 6, “q-th row” should be “q-th column”.

In equation (31) on page 10, “r(I,j)” should be “r(i,j)”.

 

In Section 5.3.1, how the Shepard goodness is calculated needs to be explained.
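(For reference, Shepard goodness is commonly computed as the Spearman rank correlation between the pairwise distances in the original space and those in the low-dimensional embedding, i.e., the rank correlation over the Shepard diagram. A minimal Python sketch, assuming the data matrix X and its embedding Z are available as NumPy arrays, could look as follows.)

from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

def shepard_goodness(X, Z):
    # Condensed vectors of pairwise distances in the original space and in the embedding.
    d_high = pdist(X)
    d_low = pdist(Z)
    # Spearman rank correlation over the Shepard diagram; values near 1 mean the
    # embedding preserves the ordering of pairwise distances well.
    rho, _ = spearmanr(d_high, d_low)
    return rho

# Hypothetical usage: X has shape (n_samples, n_features), Z has shape (n_samples, 2).
# rho = shepard_goodness(X, Z)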

On line 14 of page 1, “correltion” should be “correlation”.

On line 19 of page 1, “reducing” should be “reduce”.

On line 147 of page 4, “two input” should be “two inputs”.

On line 148 of page 5, “a number of” should be “the number of”.

On line 209 of page 9, “select” should be “selected”.

Author Response

Please see the attachment.

Author Response File: Author Response.pdf

Reviewer 2 Report

This paper introduces a novel approach to dimensionality reduction by combining Gaussian process models with UMAP. The proposed method incorporates a graph Laplacian regularization term based on GPLRF and utilizes the SM kernel for the GP-LVM. An inducing method is also introduced to improve scalability. The paper offers a detailed explanation of the model formulation and the lower bound of the log-posterior distribution. To assess the effectiveness of the proposed method, it is compared against several baseline methods on four real-world datasets. The results demonstrate that the proposed method outperforms the baseline methods in terms of both data visualization and quantitative evaluation metrics. An ablation study is also conducted to demonstrate the novelty of the proposed method. The paper provides a clear and comprehensive explanation of the proposed method and its advantages over existing methods. The experiments are rigorous, and the results are convincing.

The paper is well written; there are only a few typos in lines 14-15, 45, and 195. In summary, this paper presents a valuable contribution to the field of dimensionality reduction and is recommended reading for anyone interested in this topic.

 


Author Response

Please see the attachment.

Author Response File: Author Response.pdf

Reviewer 3 Report

A few points: in the abstract, the last sentence uses "several" two times, which is a bit vague. This is the abstract, so please let the reader know exactly what methods and what databases are meant. Similarly, Section 5 mentions "several" experiments; again, be more specific, such as "200 experiments". On line 208 you write "fashon" instead of "fashion". Coming to Sections 5.1.2 and 5.1.3, I think some people might confuse comparison methods with evaluation methods (I have used the term "comparing visualizations" in my previous works), so I think a better wording can be selected for "comparison methods".

Coming to the two methods selected for evaluation, I have the most to say, but I will refer to previous works. First, in terms of terminology, I have not seen "Shepard goodness" used as it is; maybe "Shepard goodness of fit" would be a better choice. In comparing nonlinear mapping procedures, various comparison metrics are possible, and they are used in the works below:

https://link.springer.com/article/10.1007/s00357-006-0014-2

https://link.springer.com/article/10.1007/s00371-020-01817-5

One is more recent, the other a bit older. It is best to state the possible comparison/evaluation metrics and then give a justification for their use, or at least to use one other metric so that the total number of metrics is at least three and a voting rule can be used to see which method performs best.
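(As an example of one such additional metric, neighbourhood trustworthiness is widely used for evaluating nonlinear mappings and is available directly in scikit-learn. A minimal sketch follows, assuming the original data X and a 2-D embedding Z; PCA is used here purely as an illustrative embedding, not as the paper's method.)

import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import trustworthiness

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))            # toy high-dimensional data
Z = PCA(n_components=2).fit_transform(X)  # any 2-D embedding would do here

# Fraction of each embedded neighbourhood made up of true neighbours from the
# original space (1.0 is best); it complements distance-based metrics such as
# Shepard goodness when several metrics are combined in a voting scheme.
score = trustworthiness(X, Z, n_neighbors=10)
print(f"trustworthiness: {score:.3f}")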

The captions of Tables 2 and 3 are very long; this material should really be in the text. Finally, the conclusion should be extended a little, and limitations should be added if possible.

Thanks for this interesting work. 

Author Response

Please see the attachment.

Author Response File: Author Response.pdf

Round 2

Reviewer 3 Report

My comments have all been answered, thanks.
