**2. Materials and Methods**

The data were acquired from scientific databases through the tools that these databases make available. Access to these databases is currently restricted to subscribing organizations, which limits the use of these sources. Open-access sources for scientific publications do exist, but their data quality does not match that of the sources described below. Access to science is thus limited for some researchers; nevertheless, the dominance of these resources has made them indispensable for research, and they have become official data sources at the institutional and governmental level.

Scopus is the database developed by Elsevier that indexes more than 24,600 active journal titles and more than 194,000 books from more than 5000 publishers. Its historical content dates back to 1788, and it currently contains over 75 million records, 1.4 billion cited references dating back to 1970, over 9.5 million conference papers, 43.7 million patent records from the five largest patent offices worldwide, 16 million author profiles, and around 70,000 institutional profiles. This database has therefore been used in considerable bibliometric work in every field of knowledge [30], including medicine [31,32].

Based on the data from Scopus, Elsevier has developed its own research performance analysis tool, SciVal, which offers access to the scientific output of more than 230 countries and 14,000 institutions from 1996 to the present. This tool has also been used in studies in the field of medicine [33]. The main source of data for this study has therefore been Scopus, accessed through SciVal.

To complete the data on the ranking of scientific journals, the SCImago Journal Rank (SJR) and the Journal Citation Reports (JCR) were used.
To obtain the data under analysis, the following search in SciVal was used as a starting point: "scientific publications in Spain between 1998 and 2018 filtered by ASJC categories". Journal classification schemes perform an essential function in bibliometric analysis [37]. The ASJC (All Science Journal Classification) categories are the subject classification used by SciVal to categorize Scopus sources and the publications of each of those sources (e.g., journals). Each Scopus source can be assigned to one or more categories in the classification. At the top level there are four major subject areas: physical sciences, health sciences, social sciences, and life sciences. The ASJC classification has 27 categories (see Table 1), which are further subdivided into various subcategories. Note that the multidisciplinary category belongs to all four subject areas.
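The assignment of sources to one or more ASJC categories can be sketched as a simple mapping, as below. The journal titles and publications are hypothetical; the codes 2700 (Medicine) and 1000 (Multidisciplinary) are real ASJC codes.

```python
# Illustrative sketch: a Scopus source can carry several ASJC codes,
# and its publications inherit them (titles are hypothetical).
source_asjc = {
    "Journal of Example Medicine": {2700},      # Medicine
    "Multidisciplinary Example": {1000, 2700},  # Multidisciplinary + Medicine
}

publications = [
    {"title": "Paper 1", "source": "Journal of Example Medicine"},
    {"title": "Paper 2", "source": "Multidisciplinary Example"},
]

# Select publications whose source is assigned to ASJC 2700 (Medicine).
medicine = [p["title"] for p in publications
            if 2700 in source_asjc[p["source"]]]
print(medicine)  # ['Paper 1', 'Paper 2']
```

Because a source may carry several codes, the same publication can appear under more than one category when filtering by ASJC.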



Figure 1 summarizes the methodology. Once the search was performed, it was filtered by the bibliometric marker "Patent-Cited Scholarly Output" for all publication types and all patent offices. The result comprises all publications that have been cited in at least one patent. The coverage of these patents reaches the five largest patent offices: the EPO (European Patent Office), USPTO (United States Patent and Trademark Office), UK IPO (United Kingdom Intellectual Property Office), JPO (Japan Patent Office), and WIPO (World Intellectual Property Organization).

**Figure 1.** Methodology flowchart. Note: SJR (SCImago Journal Rank); JCR (Journal Citation Reports).
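The "Patent-Cited Scholarly Output" filter described above amounts to keeping publications with at least one patent citation. A minimal sketch, with hypothetical records:

```python
# Hypothetical sketch of the "Patent-Cited Scholarly Output" filter:
# keep publications cited in at least one patent (data is illustrative).
pubs = [
    {"id": "p1", "patent_citations": 3},
    {"id": "p2", "patent_citations": 0},
]
patent_cited = [p["id"] for p in pubs if p["patent_citations"] >= 1]
print(patent_cited)  # ['p1']
```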

On the basis of these data, the evolution from 1998 to 2018 of the publications cited in patents, the contribution of the authors of these publications to the development of patents, and the international collaboration between these authors were analysed.

In the analysis of affiliations, the source data were completed with data on the global publications of each affiliation between 1998 and 2018, based on Scopus. The search was carried out by affiliation, considering the publications attributed institutionally to each of the universities or R&D centers under study in this date range.

When analyzing journal impact, it was decided to analyze the impact of each journal in JCR, based on data from 2018, obtaining the following metric values:


Each journal in JCR is assigned to at least one category and may be classified in more than one category.

• Five-year journal impact factor. This indicator shows the average number of times articles from the journal published in the previous five years were cited in the JCR year. It is calculated by dividing the number of citations received in the JCR year by the total number of articles published in the previous five years.
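The calculation above reduces to a single ratio; a worked example with hypothetical figures (not taken from JCR):

```python
# Illustrative five-year journal impact factor (figures are hypothetical).
citations_in_jcr_year = 1200   # citations in the JCR year to items from the previous five years
items_published_prev_5y = 400  # articles published in the previous five years

five_year_if = citations_in_jcr_year / items_published_prev_5y
print(five_year_if)  # 3.0
```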

When analyzing research topics, SciVal uses so-called Topics. A Topic is a set of documents with a common research interest. Topics are derived from the citation network of 95% of the Scopus content (all documents published since 1996) and are grouped within SciVal using direct citation analysis of document reference lists. A document can belong to only one Topic; as newly published documents are indexed, they are assigned to Topics based on their reference lists. This makes Topics dynamic, and most of them grow over time.

Topics are obtained from more than one billion citation links between more than 48 million documents indexed by Scopus from 1996 onwards and more than 20 million non-indexed documents that are cited at least twice. There are approximately 96,000 Topics. Once a year, SciVal re-runs the Topics algorithm to identify newly emerging Topics. A combination of the potential for emergence (recent publication counts versus previous years), the size of the Topic, citations, and funding is considered when ranking a new Topic. For example, in 2019, 37 new Topics were identified and added to SciVal.

Each Topic name forms part of a Topic Cluster name. A Topic Cluster is created by adding Topics with similar research interests to form a broader, higher-level research area. Topic Clusters can be used to gain a broader understanding of the research carried out by a country, an institution (or group of institutions), or a researcher (or group of researchers). Each of the 96,000 Topics has been paired with one of the 1500 Topic Clusters. As with Topics, a researcher or institution can contribute to multiple Topics, but a Topic can belong to only one Topic Cluster, and a publication can belong to only one Topic (and therefore to one Topic Cluster). Topic Clusters are formed using the same direct citation algorithm that creates the Topics: when the strength of the citation links between Topics reaches a threshold, a Topic Cluster is formed.
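The key property of direct citation clustering — each document ends up in exactly one group — can be illustrated with a toy sketch that treats citation links as an undirected graph and takes connected components. This is not SciVal's actual algorithm (which also weighs link strength against a threshold); it only demonstrates the one-document-one-Topic behaviour on hypothetical data.

```python
# Toy sketch (not SciVal's implementation): group documents by direct
# citation links; connected documents land in the same cluster.
from collections import defaultdict

def cluster_by_direct_citation(citations):
    """citations: list of (citing_doc, cited_doc) pairs."""
    graph = defaultdict(set)
    for citing, cited in citations:
        graph[citing].add(cited)
        graph[cited].add(citing)
    seen, clusters = set(), []
    for doc in graph:
        if doc in seen:
            continue
        stack, component = [doc], set()
        while stack:
            d = stack.pop()
            if d in component:
                continue
            component.add(d)
            stack.extend(graph[d] - component)
        seen |= component
        clusters.append(component)
    return clusters

links = [("A", "B"), ("B", "C"), ("D", "E")]
topic_groups = cluster_by_direct_citation(links)
print(topic_groups)  # two clusters: {A, B, C} and {D, E}
```

Because the components are disjoint by construction, each document belongs to exactly one cluster, mirroring the one-Topic-per-publication constraint described above.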

Among the other possible metrics for evaluating the quality of journals, the field-weighted citation impact (FWCI) was chosen: the average number of citations received relative to the number expected. Recent studies show that the FWCI is consistent across different areas of research [38]. Expected citations are calculated for documents of the same publication year, the same publication type, and the same discipline. The benchmark value is 1: a value above 1 means a publication has been cited more than expected, and a value below 1 means it has not reached the expected level.
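The FWCI definition above is a ratio of actual to expected citations; a worked example with hypothetical figures:

```python
# Hypothetical FWCI example: actual citations divided by the citations
# expected for documents of the same year, type, and discipline.
actual_citations = 12
expected_citations = 8.0  # world average for comparable documents

fwci = actual_citations / expected_citations
print(fwci)  # 1.5 -> cited 50% more than expected
```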
