Russian University Teachers’ Perceptions of Scientometrics
Round 1
Reviewer 1 Report
The topic of the article is interesting and highlights, on the one hand, the need to contextualize the use of scientometric indicators and, on the other hand, the level of knowledge of the Russian scientific community regarding the evaluation of research by means of scientometrics.
However, we found important shortcomings in the formal structure of the article, as well as a lack of context.
The introduction has some minor inaccuracies. I point out some examples:
In lines 21-23: “Scientometrics is a special discipline that conducts research of scholarship using mathematical methods, data collecting, and statistical processing of bibliographic information (the number of published scientific papers, their citations, etc.)”
This is incorrect; the following are just a few of the many works showing that the number of scientometric indicators is considerably higher.
Glänzel, W., Moed, H. F., Schmoch, U., & Thelwall, M. (Eds.). (2019). Springer Handbook of Science and Technology Indicators. Springer Nature.
Wildgaard, L., Schneider, J. W., & Larsen, B. (2014). A review of the characteristics of 108 author-level bibliometric indicators. Scientometrics, 101(1), 125-158.
Todeschini, R., & Baccini, A. (2016). Handbook of bibliometric indicators: Quantitative tools for studying and evaluating research. John Wiley & Sons.
In line 24: “The origin of scientometrics dates back to before the Second World War.” The origin of scientometrics, as the science of science, dates to after the Second World War.
The origin of bibliometrics dates back to the end of the 19th century, but scientometrics did not emerge until the mid-1950s.
It would be interesting if the problems of the indiscriminate use of scientometrics and the inappropriate application of scientometric indicators in the evaluation of research results were mentioned more systematically in the introduction.
The second section of the article (“2. The Main Scientometric Indicators, Their Objectivity and Forced Introduction of Scientometric Indicators in Russia”) provides an overly superficial and biased description of the main scientometric indicators; however, the references to the application of scientometrics in Russia are very interesting. I would suggest that the authors develop the first point further.
As an example of what I mean, I refer to lines 75-78, where it says:
“Finishing the story with the analysis of the problem of registration and accounting of the total number of publications, it is impossible not to touch upon the problem of co-authorship.” It is true that multi-authorship and fraud in authorship have caused problems, but nothing is said about the fractional counting methods that have been developed to alleviate them (see the note below).
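For context (this is not something the reviewed manuscript mentions), the simplest fractional counting scheme divides credit evenly among co-authors, so that each paper contributes a total of 1 regardless of how many authors it has:
credit per author = 1/n for a paper with n co-authors;
fractional citation credit per author = c/n, where c is the paper's citation count.
Harmonic and first-author-weighted variants have also been proposed.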
In lines 87-89: “scientometrics offers another indicator that is designed to solve the problem of the quality of scholarly work. This is the citation.” The article I cite below shows that there are many impact indicators. On the other hand, indicators of the quality of academic work are based on peer review, so it would be more appropriate to say that scientometrics works with indicators of research impact.
Waltman, L. (2016). A review of the literature on citation impact indicators. Journal of Informetrics, 10(2), 365-391.
The third part of the article should be preceded by a section on methodology.
Nowhere in the article are other forms of research assessment in Russia discussed or compared with scientometric assessment.
The article does not provide the background needed to compare the authors' results with the problems of applying scientometric methods in other non-English-speaking countries.
Author Response
Thank you! Please see the attachment
Author Response File: Author Response.pdf
Reviewer 2 Report
Publications (ISSN 2304-6775)
Manuscript ID publications-1194243
Title: Scientometrics in Russia and the Attitude of the Faculty of Leading Russian Universities to It
This work focuses on testing the knowledge of a sector of the university population, the teaching staff, about scientometrics. Although it deals with Russia, most of the focus is on the University of St. Petersburg. The manuscript has the potential to be published, but a number of issues need to be addressed, and since there are many, I propose a major revision of the manuscript.
The abstract should be reviewed in its entirety, as it does not provide sufficient information about the content of the work carried out. I propose providing exact rather than approximate data, e.g., 283 respondents.
The keywords do not reflect the content of the work; e.g., “national specifics” says nothing. Please check.
A table of abbreviations would be recommended to make it easier for the reader to understand the terms used. For example, it is not easy to know what HSPU stands for.
The first sentence contains an overuse of references. Seven references are not needed for this. I request that the authors leave only one of the references they have proposed.
The introduction focuses on scientometric aspects, which are well known, but this can be considered adequate. However, what is missing is an overview of Russian universities: the number of universities, students, and professors, ... research expenditure. All these data in the introduction would help to better frame the problem to be studied.
The authors do not clearly define the objective of the work. This should be reflected at the end of the introduction.
Line 123. “citations each and the other (Np-h) papers have ≤h citations each [16] (16569)”. What does this last number mean?
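For reference, the standard Hirsch definition that the quoted passage paraphrases reads: a researcher has index h if h of their N_p papers have at least h citations each and the remaining (N_p - h) papers have no more than h citations each; equivalently, h = max{ i : c_i >= i }, where c_i is the citation count of the i-th most-cited paper.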
Lines 124-125. There is no need to give an example for the h-index. Remove it.
The results are explained, but I propose also presenting the data in tables:
A table of the questions asked, even if only as an appendix.
The distribution of the population and of the universities surveyed.
Results for each question.
Do you have information on the scientific field to which the respondents belong? This is of great importance, as the natural sciences are not the same as the social sciences or humanities.
A graph illustrating the survey results would give the manuscript more visibility.
Once the data have been presented, some kind of statistical analysis should be provided, not just the data themselves (a sketch of what I mean follows below).
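As a minimal sketch of the kind of analysis I have in mind (the counts and group labels below are invented purely for illustration and are not taken from the manuscript), a chi-square test of independence would show whether the distribution of answers to a question differs between respondent groups:

# Hypothetical example: test whether answers to one survey question
# depend on the respondents' institution. All counts are invented.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: two respondent groups; columns: answer options (yes / no / unsure)
observed = np.array([[40, 25, 10],
                     [30, 35, 15]])

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.3f}")

A p-value below the chosen significance level (e.g., 0.05) would indicate that the two groups answer the question differently.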
Avoid writing in the first person: "we".
Finally, I think that the title of the paper should focus on what has been done, something like: “Russian University Teachers' Perceptions of Scientometrics”.
Lastly, although the work cannot formally be said to be plagiarised, since the 21% similarity is with a source by the main author, the introduction should be rewritten so that it does not overlap so much with that previous work.
Grinëv, A. V. (2020, September). The Disadvantages of Using Scientometric Indicators in the Digital Age. In IOP Conference Series: Materials Science and Engineering (Vol. 940, No. 1, p. 012149). IOP Publishing.
I provide the plagiarism report to help the authors rewrite this part.
Comments for author File: Comments.pdf
Author Response
Thank you! Please see the attachment
Author Response File: Author Response.pdf
Round 2
Reviewer 2 Report
The manuscript has been improved according to my suggestions. Apart from minor editorial issues, I would just like to point out the following:
1- In the tables added as appendices, the decimal separator is not correct: it should be a point, but the authors use a comma.
2- In Figures 1 and 2, there is no need for decimals on the Y-axis.
3- In Figure 2, the Y-axis should not extend beyond 100%.