Article
Peer-Review Record

Artificial Intelligence-Empowered Art Education: A Cycle-Consistency Network-Based Model for Creating the Fusion Works of Tibetan Painting Styles

Sustainability 2023, 15(8), 6692; https://doi.org/10.3390/su15086692
by Yijing Chen 1, Luqing Wang 2, Xingquan Liu 1,* and Hongjun Wang 2,*
Reviewer 1: Anonymous
Reviewer 2: Anonymous
Reviewer 3: Anonymous
Submission received: 29 January 2023 / Revised: 7 April 2023 / Accepted: 13 April 2023 / Published: 15 April 2023

Round 1

Reviewer 1 Report

This paper presents the Tibetan Painting Style Fusion (TPSF) model, which is proposed to address the problem of uniform content and similar style in Tibetan-Chinese painting style fusion. The TPSF model provides a digital method for Tibetan-Chinese painting style fusion, which empowers art education and offers learners a new interactive learning model. These conclusions are verified by the experiments. However, there are still some problems.

1) In subsection 3.3, the values of Dy(Y) and Dy(G(x)) converge to 1 and 0, respectively, and their meanings should be explained in detail. For example, what is the exact meaning of Dy(Y) = 1?
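For context, the convergence targets the reviewer asks about follow from the standard binary cross-entropy adversarial loss used in CycleGAN-style training. The sketch below is a generic illustration (not the authors' exact TPSF formulation): the discriminator Dy is trained so that its score on a real target-domain image, Dy(Y), approaches 1 ("judged real"), while its score on a generated image, Dy(G(x)), approaches 0 ("judged fake").

```python
import math

def discriminator_loss(d_real: float, d_fake: float) -> float:
    """Binary cross-entropy adversarial loss for the discriminator.

    d_real corresponds to Dy(Y): the score on a real target-domain image.
    d_fake corresponds to Dy(G(x)): the score on a generated image.
    The loss is minimized as d_real -> 1 and d_fake -> 0.
    """
    return -(math.log(d_real) + math.log(1.0 - d_fake))

# A confident discriminator (near 1 on real, near 0 on fake) incurs a
# much smaller loss than an undecided one scoring 0.5 on both inputs.
confident = discriminator_loss(0.99, 0.01)
undecided = discriminator_loss(0.5, 0.5)
```

Here `confident` is close to 0 while `undecided` equals 2 ln 2 ≈ 1.386, which is why training drives Dy(Y) toward 1 and Dy(G(x)) toward 0.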

2) The reference format needs to be standardized and there are some items missing, e.g. references 18 and 25 are missing page information.

3) There are also some typos, grammatical mistakes, and non-standard expressions. For example, ‘Frechet Inception Distance (FID) indicators’, ‘modle’, and ‘Liu L et al. [18]’, etc.

Author Response

Please see the attachment.

Author Response File: Author Response.pdf

Reviewer 2 Report

Thank you for this interesting paper where you explore art and AI.

I would like to suggest the following:

Minor edits

line 29: "in the forest of Chinese ethnic painting"  - this is a phrase I do not recognise in English: 'in the forest of...'

line 56: from here, you are describing the fusion model. Please make this a separate section; it is too detailed for the introduction.

line 104: typo "realated"

line 114: "mobile learning is also a hot topic" - I don't see the relevance to your research topic

 

Major edits

Please include a section at the start of your paper that helps to understand Thangka Painting Style for a global audience. What makes this style so popular? What does it look like / what is typical about it?

Is (non-digital / non-AI) fusion of Thangka Painting Style common? What does "fusion" actually mean - is it like playing the lyrics of one song to another song's melody? Or is it like mixing genres, for example opera with rap? This should be appropriately referenced. (Currently the opening paragraph is under-referenced)

 

Please include more of a rationale for your research topic (possibly in the introduction): Why is AI-fusion of painting style of interest? Why specifically Thangka Painting Style? What is the problem you're trying to solve?

 

For methodology, did any humans evaluate whether the outcome image was good or bad, "had strong visual appeal" (line 363) - just the researcher? Evaluating the success of a technical operation solely by one technical metric is limiting, especially when making evaluative statements such as that it is "better" or has "strong visual appeal".

How does this empower art education?

My most important question concerns the last two lines of the conclusion:

"can help more learners participate in the creation of the national painting, and stimulate more learners to have a strong interest in the study of national painting art."

I am missing the logical link between the image generation, and this educational claim. Why would it help educationally? Why would it stimulate more learners? What is your research evidence for this claim?

 

Author Response

Please see the attachment.

Author Response File: Author Response.pdf

Reviewer 3 Report

This is an interesting and technically competent paper on a modern and hot topic.

 

Some remarks:

 

The abstract is just too long: describe only the essence of the paper, with an emphasis on the original contribution (without much detail!) and the results. Some abstract material (such as the sentences between the very first one and “we construct…”) can move to the Introduction.

The list of keywords could be extended - why not mention neural networks, for instance?

The introduction can also be shortened. For instance, the paragraph starting at line 56 describes the structure of the TPSF model in excessive detail for an Introduction section - it is described in the subsequent sections anyway.

The related work section provides a good description of the state of the art around the topic and covers a lot of relevant sources.

Section 3 “Design of TPSF model” is a good description of the theoretical and technical details of the model. It is not easy reading, though - but it allows the reader to understand the model, and the illustrations do help.

Section 4 “Experiments and results” provides practical evidence that the proposed model works. The diagrams in Fig. 4 are not easy to interpret, though - perhaps due to their small size. The artistic images are impressive. Perhaps larger versions of several of them would be beneficial.

 

The conclusions are adequate. However, the paper does not justify the word “education” in the title: there is very limited discussion of how students or pupils could practically work with the system and appreciate it in the context of their learning. I recommend adding at least a small section dealing with that, or removing the emphasis on education altogether.

 

 

Overall, this is a solid paper deserving to be accepted after minor revision.

Author Response

Please see the attachment.

Author Response File: Author Response.pdf
