Review
Peer-Review Record

A Global Analysis of Research Outputs on Neurotoxicants from 2011–2020: Adverse Effects on Humans and the Environment

Appl. Sci. 2022, 12(16), 8275; https://doi.org/10.3390/app12168275
by Zikhona Tywabi-Ngeva 1, Abiodun Olagoke Adeniji 2,* and Kunle Okaiyeto 3
Reviewer 1: Anonymous
Reviewer 2: Anonymous
Submission received: 8 September 2021 / Revised: 28 September 2021 / Accepted: 29 September 2021 / Published: 18 August 2022

Round 1

Reviewer 1 Report

ABSTRACT

The abstract states the following: "Data on this subject were obtained from the SCI-Expanded of Web of Science", while section 2.1 shows: Indexes: SCI-EXPANDED, SSCI, A&HCI, ESCI. Which is correct?

The abstract states that the highest publications were recorded in 2019 (n = 19)? However, section 3.2 states the following: "The year 2020 recorded the highest outputs of 40 documents (12.5%)."

The abstract says nothing about the second part of the study: networks.

 

MATERIALS AND METHODS

Nothing is stated about data cleaning: how author names, keywords, etc. were cleaned.

 

RESULTS AND DISCUSSION

I do not understand the difference between: Authors of single-authored documents (21) and Single-authored documents (24).

Why is the number of co-authors per document higher than the number of authors per document? How is the collaboration index calculated?
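[Editor's note: the collaboration index is commonly defined in bibliometric tools (e.g. the bibliometrix R package) as the total number of authors of multi-authored documents divided by the number of multi-authored documents; whether the paper uses this definition is an assumption the authors should confirm. A minimal sketch of that common definition:]

```python
def collaboration_index(author_counts):
    """Collaboration index as commonly defined in bibliometrics:
    total authors of multi-authored documents divided by the
    number of multi-authored documents.

    author_counts: list with the number of authors per document.
    """
    multi = [n for n in author_counts if n > 1]  # keep multi-authored docs only
    if not multi:
        return 0.0
    return sum(multi) / len(multi)

# Illustrative values: four documents with 1, 2, 3, and 5 authors.
# Only the multi-authored ones count: (2 + 3 + 5) / 3 ≈ 3.33
print(round(collaboration_index([1, 2, 3, 5]), 2))
```

Because single-authored documents are excluded from the numerator and denominator, this index can exceed the plain authors-per-document average, which may explain the discrepancy the reviewer notes.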

Figure 1 shown in section 3.2 is missing.

Figure 2 shown in section 3.2 is missing.

In section 3.3 (and throughout the article), the text repeats information already shown in the table ("The top 10 among these authors are…"); this is not necessary.

The h-index: on what date was it calculated, April 12? (The same applies to table 4.)

Both the h-index and the TC are for all 321 documents, right?

In table 3 it would be good to include two more columns: Country and type of organization.

Section 3.5 states the following: “neurotoxicology had both the highest number of articles and total citations, hence, it is the most impactful among the top 20 journals in this field of study”, however, there are journals with a better ratio: NP/TC.

The meaning of tables 5 and 6 is not understood (what is the difference between them?). The top 20 cited articles within a particular country (locally) and internationally?

Although this is done throughout the text, it would be good to add a new column in tables 5 and 6 with the main objectives of the different articles. It would also be good to include the titles of the articles.

In section 3.7 “table 4” should be changed to “table S1”

Why is table S1 not included in the main text?

In figure 1 why are Canada, Denmark and Poland connected to other countries if they have no international collaboration?

In relation to figure 2, the following is indicated: "The authors with the similar or same colour are in the same cluster". That being the case, why are there no connections between members of the same cluster? Is it correct to use the word "similar"? Shouldn't it just be "same"?

What were the criteria for selecting the nodes?

 

REFERENCES

References that contextualize bibliometrics are missing, such as:

Holmberg, Kim (2015). Altmetrics for information professionals: Past, present and future. ISBN: 978 0 081002773

Zarrabeitia-Bilbao, Enara; Álvarez-Meaza, Izaskun; Río-Belver, Rosa-María; Garechana-Anacabe, Gaizka (2019). “Additive manufacturing technologies for biomedical engineering applications: Research trends and scientific impact”. El profesional de la información, v. 28, n. 2, e280220

Author Response

Sincere appreciation to the reviewer for the thorough work done. Attached is our response. Thank You.

Author Response File: Author Response.doc

Reviewer 2 Report

  1. I am not sure about the data source of this study. Line 160 indicates that "SCI-EXPANDED, SSCI, A&HCI, ESCI" have been used. However, this contradicts the abstract section. Besides, the sub-datasets of Web of Science and their corresponding coverage timespans should be specified, as indicated in "The data source of this study is Web of Science Core Collection? Not enough. Scientometrics 121, 1815–1824 (2019)."


  2. It is not precise to state "Web of Science (WoS) stands out as a unique, all-inclusive, and reliable database for this purpose, given its accommodation of several studies (millions in more than 12,000 journals) with high impact factors and extremely good quality across the globe [26, 27]." Even though many references have been cited in this paper, I recommend the authors replace them with normative and classical literature published in authoritative journals in the field of Scientometrics.
  3. Why should document types such as corrections, meeting abstracts, and news items be included in the analysis?
  4. Figures are not reader friendly/informative.

Author Response

Sincere appreciation to the reviewer for the thorough work done. Attached is our response. Thank You.

Author Response File: Author Response.docx

Round 2

Reviewer 2 Report

Figures are not reader friendly/informative. Please fix them with other software.

Author Response

Reviewer's Comment: Figures are not reader friendly/informative. Please fix them with other software.

Authors' Response: All the figures have been reproduced and are now reader friendly. Many thanks to the reviewer for the great effort in making the paper a worthy one.
