Article

Best Practices in Knowledge Transfer: Insights from Top Universities

by Annamaria Demarinis Loiotile 1,2, Francesco De Nicolò 1,2, Adriana Agrimi 3, Loredana Bellantuono 4,5,*, Marianna La Rocca 1,6, Alfonso Monaco 1,5, Ester Pantaleo 1, Sabina Tangaro 5,7, Nicola Amoroso 5,8 and Roberto Bellotti 1,5
1 Dipartimento Interateneo di Fisica, Università degli Studi di Bari Aldo Moro, 70126 Bari, Italy
2 Dipartimento di Ingegneria Elettrica e dell’Informazione, Politecnico di Bari, 70125 Bari, Italy
3 Direzione Ricerca, Terza Missione e Internazionalizzazione, Università degli Studi di Bari Aldo Moro, 70121 Bari, Italy
4 Dipartimento di Biomedicina Traslazionale e Neuroscienze (DiBraiN), Università degli Studi di Bari Aldo Moro, 70124 Bari, Italy
5 Istituto Nazionale di Fisica Nucleare, Sezione di Bari, 70125 Bari, Italy
6 Laboratory of Neuro Imaging, USC Stevens Neuroimaging and Informatics Institute, Keck School of Medicine, University of Southern California, 2025 Zonal Avenue, Los Angeles, CA 90033, USA
7 Dipartimento di Scienze del Suolo, della Pianta e degli Alimenti, Università degli Studi di Bari Aldo Moro, 70126 Bari, Italy
8 Dipartimento di Farmacia-Scienze del Farmaco, Università degli Studi di Bari Aldo Moro, 70125 Bari, Italy
* Author to whom correspondence should be addressed.
Sustainability 2022, 14(22), 15427; https://doi.org/10.3390/su142215427
Submission received: 8 October 2022 / Revised: 1 November 2022 / Accepted: 15 November 2022 / Published: 20 November 2022

Abstract

The impact of the knowledge transfer induced by universities on the economy, society, and culture is widely acknowledged; nevertheless, this aspect is often neglected by university rankings. Here, we considered three of the most popular global university rankings and the specific knowledge transfer indicators of U-Multirank, a European ranking system launched by the European Commission, in order to answer the following research question: how do the world’s top universities, evaluated according to global university rankings, perform from a knowledge transfer point of view? To this aim, the top universities were compared with the others through the calculation of a Global Performance Indicator in Knowledge Transfer (GPI KT), a hierarchical clustering, and an outlier analysis. The results show that the universities best rated by global rankings do not always perform as well from a knowledge transfer point of view. By combining the obtained results, it is possible to state that only 5 universities (Berkeley, Stanford, MIT, Harvard, CALTECH), among the top in the world, exhibit a high-level performance in knowledge transfer activities. For a better understanding of the success factors and best practices in knowledge transfer, a brief description of the 5 cited universities is provided, in terms of organization of the technology transfer service, relationships with business, entrepreneurship programs, and, more generally, third mission activities. A joint reading of the results suggests that the most popular global university rankings probably fail to effectively capture third mission activities, because these can manifest themselves in a variety of forms and, due to the intrinsic and intangible nature of third mission variables, are difficult to quantify with a few simple indicators.

1. Introduction

It is widely recognized that universities contribute to the social, economic, and cultural development of the regions in which they operate [1,2,3,4,5,6,7]. They are devoted to teaching and research activities, but they also carry out the so-called “third mission”, portrayed as a contribution to society [8,9,10,11,12].
Although the “third mission” encompasses a wide set of activities that can be classified into the social, enterprising, and innovative dimensions [13,14], the focus of this paper is on the more traditional activities related to Knowledge Transfer (KT) [15,16,17,18,19]. KT aims to maximize the two-way flow of technology, Intellectual Property, and ideas: on one side, research organizations advance in research and teaching; on the other side, the business and public sectors drive innovation, a process that leads to overall economic and social benefit [20,21]. KT is an essential source of innovation and a mechanism for the dissemination of research results [22], and it has changed significantly in recent years, moving from the more traditional concept of commercialization and monetization towards a “more rounded approach which supports both co-creation and the dissemination of research results with, and to, non-academic third parties” [22].
Today, KT is a broad, well-established activity that includes research collaboration with industry, patenting, consultancy activity, licensing of new technologies, startup and spinoff creation, entrepreneurship programs, incubation, and more [23]. To this aim, the European Commission has defined a number of policies to promote KT through different channels, from Intellectual Property (IP) management and citizen engagement to academia-industry collaborations.
Furthermore, previous research has shown that the impact of research on the economy and the society can come from intentional and unintentional mechanisms [24]. The KT process, defined as the movement of know-how, technical knowledge, or technology from one organizational setting to another [25], represents the intentional way, which implies transaction costs to transfer technology from one organization to another. Unintentional mechanisms include positive externalities and other types of spillovers [26,27] without associated transaction costs.
In this complex scenario, while performance indicators relative to research (in terms of quality of publications and number of citations) and teaching (in terms of student-to-staff ratio, student evaluation, etc.) are widely known and used, less is known about how KT can be characterized and evaluated [23,28]. Furthermore, most of the best-known global university rankings completely lack instruments to evaluate KT activities [29,30]. Recently, the European Commission [31] has invested significant effort in producing the U-Multirank (UMR) ranking [32], which is based on a different approach compared with the existing global university rankings and, furthermore, includes a set of indicators focused on KT.
The research question addressed in this paper is: how do the world’s top universities, evaluated according to global university rankings, perform from a KT point of view? Accordingly, we:
-
Identify and analyze three of the best-known global university rankings in order to identify the world’s top universities and, at the same time, evaluate the coherence between rankings;
-
Search for and select a set of specialized KT indicators for evaluating the world’s top universities from the KT point of view;
-
Verify whether the world’s top universities, according to the global rankings, remain top performers from a KT point of view.
The paper is organized as follows. Section 1 presents the global university rankings analyzed in the paper (the Academic Ranking of World Universities (ARWU), the QS World University Rankings® (QSWUR), and the Times Higher Education World University Rankings (THEWUR)) and introduces U-Multirank with its specific knowledge transfer indicators. Section 2 illustrates the four-step methodology, based on the Global Performance Indicator in Knowledge Transfer (GPI KT) for the top 10 and the top 100 universities in the rankings, hierarchical clustering, and outlier analysis. Section 3 presents the application of the methodology and the results obtained, subdivided into the different steps: the classification of the world’s top 10 universities, their classification with respect to the U-Multirank KT indicators, the performance of universities according to the GPI KT, the hierarchical clustering results, and the outlier analysis results. Section 4 discusses the results and describes the “successful cases”, i.e., the organization of the technology transfer service, the relationships with business, the entrepreneurship programs and, more generally, the third mission activities adopted by the 5 best universities. Section 5 illustrates the conclusions of the work.

1.1. Global University Rankings

Nowadays, rankings permeate multiple sectors and address multiple dimensions of individual and organizational behavior [33]. There is a wide range of public measures, such as different types of ratings, benchmarks, and rankings, especially for universities [34,35]. Broader access to higher education has resulted in an increasing demand for information on academic quality and has led to the development of university ranking systems or league tables in many countries around the world [36,37]. The increase in the relevance of rankings of academic institutions is a fairly recent phenomenon that has emerged since the late 1980s [38], primarily because prospective students wanted to be informed about academic quality [36]. However, rankings have been increasingly used to compare and quantify success far beyond the question of student choice, and now influence researchers, employers, public opinion, and, most importantly, academic evaluators, governments, and companies [39,40,41,42,43]. They have become a tool for promoting the growth of universities in the international context, increasing their competitiveness, and enhancing the attractiveness of the educational and research system [44]: a true marketing, benchmarking, and branding tool [29].
Although the scientific literature has already highlighted the presence of some critical aspects related to university rankings, such as the inhibition of regional contributions from universities [45], in recent years university rankings have been able to create impact [42,46] and influence higher education, policy, and public opinion [47,48].
Numerous global university rankings have been proposed so far; they are based on different parameters and indicators [41,49,50] and try to provide an all-round evaluation. Some of the best-known and most popular are the following:
  • The Academic Ranking of World Universities (ARWU) was first published in June 2003 by the Center for World-Class Universities (CWCU), Graduate School of Education (formerly the Institute of Higher Education) of Shanghai Jiao Tong University, China, and updated on an annual basis. Since 2009 the Academic Ranking of World Universities (ARWU) has been published and copyrighted by Shanghai Ranking Consultancy, a fully independent organization on higher education intelligence that is not legally subordinated to any universities or government agencies. ARWU uses six objective indicators to rank world universities, including the number of alumni and staff winning Nobel Prizes and Fields Medals, the number of highly cited researchers selected by Clarivate Analytics, the number of articles published in journals such as Nature and Science, the number of articles indexed in Science Citation Index—Expanded and Social Sciences Citation Index, and the per capita performance of a university. More than 1800 universities are ranked by ARWU every year and the best 1000 are published [51].
  • The QS World University Rankings® (QSWUR) lists and ranks over 1000 universities from around the world, covering 80 different locations; it continues to rely on a remarkably consistent methodological framework, compiled using six simple metrics that effectively capture university performance. Universities are evaluated according to the following six metrics: Academic Reputation, Employer Reputation, Faculty/Student Ratio, Citations per faculty, International Faculty Ratio, International Student Ratio [52].
  • The Times Higher Education World University Rankings (THEWUR) includes almost 1400 universities across 92 countries, standing as the largest and most diverse university ranking to date. It is based on 13 carefully balanced and comprehensive performance indicators and is trusted by students, academics, university leaders, industry, and governments. Its performance indicators are grouped into five main areas: teaching (the learning environment); research (volume, income, and reputation); citations (research influence); international outlook (staff, students, and research); and industry income (knowledge transfer) [53].

1.2. U-Multirank 2020 (UMR)

Global university rankings have often been criticized [29,37,40,50] because, for example, they use a single set of indicators to compare different types of institutions [42], and because they have evolved from a “semi-academic exercise” into an international business tool [41] and an important “instrument for the exercise of power” [48]. To overcome these limitations, other rankings have been proposed in the last few years. The most important and popular is UMR, launched by the European Commission in collaboration with the Bertelsmann Foundation and Banco Santander, and based on the results of a feasibility study covering 150 universities carried out in 2010/11 [54].
UMR is based on a different approach compared with the existing global university rankings. It compares university performances considering the different activities in which universities are engaged, taking into account the diversity of the higher education sector and the complexity of evaluating educational performance [32,55,56]. UMR has been developed on the basis of a number of design principles: user-drivenness, multidimensionality, comparability, the multilevel nature of higher education, and methodological soundness [57]. It is considered a transparency tool for higher education stakeholders [58] and takes into account five dimensions of universities’ performance: (1) teaching and learning, (2) research, (3) knowledge transfer, (4) international orientation, and (5) regional engagement. The UMR web tool allows users to compare universities as well as study programs. Based on empirical data, it compares institutions with similar profiles (“like-with-like”) and allows users to develop their own personalized rankings by selecting indicators according to their own preferences. Each of the five dimensions evaluated by UMR is rated from A to E, with A and E indicating “very good” and “weak” performance, respectively.
In this paper, the choice to use UMR was motivated by the fact that it adopts a focused quality model for an in-depth evaluation of the KT dimension, based on the following indicators [59]:
  • Co-publications with industrial partners: the percentage of a department’s research publications that list an author affiliated with an address that refers to a for-profit business enterprise or private sector R&D unit (excluding for-profit hospitals and education organizations).
  • Income from private sources: the percentage of external research revenues (including not-for-profit organizations) coming from private sources, excluding tuition fees. Measured in €1000s using Purchasing Power Parities and computed per FTE (full-time equivalent) academic staff.
  • Patents awarded (absolute numbers): the number of patents assigned to inventors working at the university in the respective reference period.
  • Patents awarded (size-normalized): the number of patents assigned to inventors working at the university over the respective reference period, computed per 1000 students to take into consideration the size of the institution.
  • Industry co-patents: the percentage of the number of patents assigned to inventors working at the university during the respective reference period, which were obtained in cooperation with at least one applicant from the industry.
  • Spinoffs: the number of spinoffs (i.e., firms established on the basis of a formal KT arrangement with the university) recently created by the university (computed per 1000 FTE academic staff).
  • Publications cited in patents: the percentage of the university’s research publications that were cited in at least one international patent (as included in the PATSTAT database).
  • Income from continuous professional development: the percentage of the university’s total revenues that is generated from activities delivering Continuous Professional Development courses and training.
  • Graduate companies: the number of companies newly founded by graduates and computed per 1000 graduates.

2. Methodology

The research question (RQ) addressed in the paper is: how do the world’s top universities, evaluated according to global university rankings, perform from a knowledge transfer point of view?
Once the RQ has been answered, the final goal is to identify the best practices in knowledge transfer that can be adopted by universities that want to improve their performance.
In order to answer the RQ, the methodology described in this section (Figure 1) was defined and followed.
The first step was to identify and analyze three of the best-known global university rankings in order to extract the world’s top universities and, at the same time, evaluate the coherence between rankings.
The rankings selected are the Academic Ranking of World Universities (ARWU), the QS World University Rankings® (QSWUR) and the Times Higher Education World University Rankings (THEWUR), as described in the previous section.
Each ranking has its own specificities; thus, to define “top universities”, we considered the union Tk of all three rankings in 2020, so that:
$$T_k = T_k^{\mathrm{ARWU}} \cup T_k^{\mathrm{QSWUR}} \cup T_k^{\mathrm{THEWUR}}, \qquad (1)$$
where $T_k^{\mathrm{ARWU}}$, $T_k^{\mathrm{QSWUR}}$, and $T_k^{\mathrm{THEWUR}}$ are the sets of the top k universities in the rankings ARWU, QSWUR, and THEWUR, respectively.
The sets T10 and T100 were then determined; furthermore, the coherence between rankings was tested by using Spearman’s correlation among the top 100 positions in ARWU, QSWUR, and THEWUR.
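As a minimal illustration of Equation (1), the following Python sketch builds T_k as the union of the top-k entries of the three rankings; the shortened lists and the value of k are hypothetical stand-ins for the real 2020 tables.

```python
# Sketch of Equation (1): ordered lists stand in for the real 2020 rankings.
arwu = ["Harvard University", "Stanford University", "University of Cambridge"]
qswur = ["Massachusetts Institute of Technology (MIT)", "Stanford University", "Harvard University"]
thewur = ["University of Oxford", "CALTECH", "University of Cambridge"]

def top_k_union(k, *rankings):
    """T_k: the union of the top-k universities of each ranking."""
    return set().union(*(r[:k] for r in rankings))

T10 = top_k_union(3, arwu, qswur, thewur)  # k = 10 with the real lists
print(len(T10), sorted(T10))
```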
The second step was to search for and select a set of specialized KT indicators for evaluating the world’s top universities from the KT point of view. To this aim, the previously described UMR2020 was used. Among the 9 UMR2020 indicators, the following 5 were selected for the T10 and T100 elaboration, because they had the minimum fraction (less than 8%) of null values (see the selection sketch after the list):
(1) Co-publications with industrial partners
(2) Patents awarded (absolute numbers)
(3) Patents awarded (size-normalized)
(4) Industry co-patents
(5) Publications cited in patents
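The selection by null values can be made mechanical; the sketch below assumes the UMR2020 table is available as a CSV file (the file name and layout are hypothetical), with one row per university and one column per KT indicator.

```python
import pandas as pd

# Hypothetical file: one row per university, one column per UMR2020 KT indicator.
umr = pd.read_csv("umr2020_kt_indicators.csv")

# Keep only the indicators whose fraction of null values is below 8%,
# mirroring the selection criterion stated above.
null_fraction = umr.isna().mean()
selected = null_fraction[null_fraction < 0.08].index.tolist()
print(selected)  # with the real data, this should return the five indicators above
```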
The five indicators were quantified for all universities included in T10 and T100, and a composite indicator called Global Performance Indicator in Knowledge Transfer (GPI KT), obtained as the average of the 5 previous indicators, was defined and used to determine the global performance in knowledge transfer of each university. Then, a comparison was made between the T10 obtained from the global university rankings and the top performing universities in KT included in T100, also using radar plots to express the macroscopic differences graphically.
The goal of this analysis is to verify whether the world’s top universities, according to the global rankings, remain top performers from a KT point of view.
The third step was to investigate the universities included in T100 in order to identify groups of similar universities, in terms of KT indicators, through a data-driven approach based on hierarchical clustering. The goal is to understand, from the natural aggregation into groups, whether common characteristics exist that can explain the different levels of performance in knowledge transfer. Hierarchical clustering algorithms group similar items in an unsupervised way [60]. Compared with optimization-based clustering methods, such as K-means [61], this class of algorithms follows an alternative approach with the advantages of being deterministic and of not requiring the number of clusters to be fixed a priori. An agglomerative single-linkage algorithm was used, which starts by considering each point, corresponding to a data vector, as a cluster, and proceeds by iteratively merging the closest pairs of clusters, until ending up with one cluster that includes all data points. The vicinity of two points $i$ and $j$ is quantified by their Euclidean distance $\|i - j\|_2$, while the distance between clusters $A$ and $B$ is evaluated as $d_{A,B} = \min_{i \in A,\, j \in B} \|i - j\|_2$, namely the minimum distance between points in the two clusters. The algorithm produces dendrograms, which help with the interpretation of the results.
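A minimal sketch of this clustering procedure with SciPy is reported below; the matrix `X` is a toy stand-in for the table of KT indicator values (one row per university), and the single-linkage criterion implements the minimum-distance cluster merging defined above.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.cluster.hierarchy import linkage, dendrogram, fcluster

# Toy stand-in for the T100 data: one row per university, one column per indicator.
rng = np.random.default_rng(0)
X = rng.random((20, 5))

# Agglomerative clustering with Euclidean distances and single linkage:
# d(A, B) = min over i in A, j in B of ||i - j||_2, as defined in the text.
Z = linkage(X, method="single", metric="euclidean")

dendrogram(Z)  # dendrogram supporting the interpretation of the results
plt.show()

labels = fcluster(Z, t=4, criterion="maxclust")  # e.g., cut the tree into 4 clusters
```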
As a last step, we tried to understand how the universities included in T10 perform compared with the others in T100 for each of the 5 UMR2020 indicators, in order to understand, from the KT perspective, the strengths of those universities that the global rankings identify as the best performers. For all the universities in T100 and for each of the 5 UMR2020 indicators, the median absolute deviation (MAD) criterion [62] was used to compare each university in T10 with the distribution of the remaining universities in T100. In detail, we computed the scaled MAD for each KT indicator as:
$$\mathrm{MAD} = c \cdot \mathrm{median}\left( \left| T_{100} - \mathrm{median}(T_{100}) \right| \right), \qquad (2)$$
where $T_{100}$ here denotes the vector of values of the indicator over the universities in T100 and $c = -1/\left(\sqrt{2}\,\mathrm{erfcinv}(3/2)\right) \approx 1.4826$; then, we determined whether each university in T10 represented an outlier for the distribution of items in T100, considering three scaled median absolute deviations (MAD) away from the median as the threshold for outlier detection [63,64].
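A sketch of this outlier rule in Python could read as follows; it mirrors the scaled-MAD criterion of formula (2) (the same default used, for instance, by MATLAB's isoutlier function).

```python
import numpy as np
from scipy.special import erfcinv

def scaled_mad(x):
    """Scaled MAD of formula (2); the constant c (about 1.4826) makes the MAD
    a consistent estimator of the standard deviation for normal data."""
    c = -1.0 / (np.sqrt(2.0) * erfcinv(1.5))
    return c * np.median(np.abs(x - np.median(x)))

def is_outlier(value, reference, n_mad=3.0):
    """True if `value` lies more than n_mad scaled MADs away from the median of
    `reference` (here, the values of a KT indicator over the T100 universities)."""
    reference = np.asarray(reference, dtype=float)
    return abs(value - np.median(reference)) > n_mad * scaled_mad(reference)
```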

3. Methodology Application and Results

In this section the step-by-step application of the methodology presented in Section 2 is described together with the obtained results.

3.1. First Step

The first step of the methodology is to identify the sets T10 and T100. For k = 10, the set T10 turns out to be the union of the top ten universities included in each of the selected rankings: ARWU, QSWUR, and THEWUR (Table 1). Similarly, T100 was determined, according to Equation (1).
Thus, T10 = {California Institute of Technology (CALTECH), Columbia University, ETH Zurich—Swiss Federal Institute of Technology, Harvard University, Imperial College London, Massachusetts Institute of Technology (MIT), Princeton University, Stanford University, University College London (UCL), University of California, Berkeley, University of Cambridge, University of Chicago, University of Oxford, Yale University}, which includes 14 universities.
The coherence between rankings, or their level of agreement, was evaluated by using Spearman’s correlation among the top 100 positions in each ranking. Spearman’s rank correlation coefficient or Spearman’s ρ is a nonparametric measure of rank correlation [65]. It is used to assess how well the relationship between two variables can be described using a monotonic function. The obtained Spearman’s correlations are: 0.89 between ARWU and QSWUR; 0.91 between ARWU and THEWUR; 0.95 between QSWUR and THEWUR. Therefore, all rankings are strongly correlated and thus they exhibit an underlying coherence.
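A coherence check of this kind is straightforward to reproduce with scipy.stats.spearmanr; the rank vectors below are hypothetical stand-ins for the positions of the same 100 universities in two rankings.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
rank_a = np.arange(1, 101)                       # positions in ranking A
rank_b = rank_a + rng.integers(-5, 6, size=100)  # slightly perturbed ranking B

rho, pvalue = spearmanr(rank_a, rank_b)
print(f"Spearman rho = {rho:.2f}")  # values close to 1 indicate strong agreement
```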

3.2. Second Step

In the second step, a set of specialized KT indicators for evaluating the world’s top universities from the KT point of view was identified. Starting from UMR2020, 5 indicators were selected:
(1) Co-publications with industrial partners
(2) Patents awarded (absolute numbers)
(3) Patents awarded (size-normalized)
(4) Industry co-patents
(5) Publications cited in patents.
The indicators are expressed on a percentage scale, except for (2) Patents awarded (absolute numbers) and (3) Patents awarded (size-normalized); the latter two were therefore rescaled to a percentage scale, so that they are comparable with the others and can enter arithmetic operations. Furthermore, a composite index, given by the arithmetic mean of the five indicators of the UMR2020 dataset, was introduced. It can be considered a global performance indicator of the KT actions of the universities, i.e., the GPI KT defined in Section 2.
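Since the exact rescaling formula is not stated, the sketch below assumes a simple min-max rescaling to a 0-100 range for the two patent counts before averaging; the column names follow the indicator labels listed above, and the data frame is assumed to hold exactly the five indicator columns.

```python
import pandas as pd

def to_percent_scale(s: pd.Series) -> pd.Series:
    # Min-max rescaling to [0, 100]; an assumption, since the exact
    # normalization of the two patent counts is not specified in the text.
    return 100.0 * (s - s.min()) / (s.max() - s.min())

def gpi_kt(df: pd.DataFrame) -> pd.Series:
    """GPI KT: arithmetic mean of the five UMR2020 indicators, after bringing
    the two patent counts onto a percentage scale (df: five indicator columns)."""
    df = df.copy()
    for col in ("Patents awarded (absolute numbers)", "Patents awarded (size-normalized)"):
        df[col] = to_percent_scale(df[col])
    return df.mean(axis=1)
```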
Table 2 shows, in descending order, how the universities in T10 perform. The best performer is the University of California, Berkeley (UCB). For the first indicator, co-publications with industrial partners, the Massachusetts Institute of Technology (MIT) and Imperial College London are the best performers. The indicator “Patents awarded (absolute numbers)” reveals that the University of California, Berkeley is the best performer, followed by Harvard University, while for “Patents awarded (size-normalized)” the top university is the California Institute of Technology (CALTECH), followed by MIT. For industry co-patents, ETH Zurich—Swiss Federal Institute of Technology ranks first, followed by the University of Cambridge; the last indicator, “Publications cited in patents”, shows that MIT and Harvard University are the strongest.
It is also interesting to investigate how the 14 universities in T10 perform when compared with the other universities. According to Equation (1), card(T100) = 151; after deleting all the entries containing missing data or null values, card(T100) = 123. For each of the universities in T100, the five UMR2020 indicators of Table 2 were calculated, together with the composite indicator GPI KT, and the list was sorted in descending order. It is interesting at this point to verify how the universities in T10 rank compared with those in T100 and whether they are still among the best performing ones. The results are surprising: only 2 of the universities included in T10, namely Berkeley and Harvard, are placed in the first 14 positions, while the remaining ones drop considerably in the ranking (Table 3). The Massachusetts Institute of Technology (MIT) appears in position 15. Table 4 shows the details of the five UMR2020 indicators and the GPI KT for the first 14 universities included in T100.
It is interesting to note how the Californian universities and, to a lesser extent, the Japanese and some Asian ones stand out in the KT even though they are not among the best if we consider the traditional global university rankings.
The radar plot in Figure 2 shows how the KT performances of the universities in T10 differ significantly from those of the best universities in T100. The best performing universities in KT have a recognized technological vocation.

3.3. Third Step

In the third step, the universities included in T100 are investigated in order to identify groups of similar universities, in terms of KT indicators, by using hierarchical clustering. The application of the hierarchical clustering algorithm defined in Section 2 provides 4 clusters, organized as described in Table 5.
Figure 3 reports the dendrograms resulting from the analysis.
Only 4 of the 14 top universities (29%) appear within the 3 identified clusters (not considering the fourth, which gathers the rest of the sample). Specifically, MIT is absent, as are ETH, Imperial College London, Cambridge University, Princeton University, University of Chicago, Columbia University, Yale University, University of Oxford, and University College London. The emergence of a cluster formed only by Stanford University and CALTECH (cluster 3) is very interesting. The emergence of a cluster (cluster 1) consisting, to the extent of 71%, of Californian universities, together with Harvard University and the Korean KAIST, appears very characteristic and is partly coherent with the result obtained in Section 3.2. The clustering is in line with the ranking reported in Table 4, in agreement with the GPI KT: in fact, the top performers are all gathered in clusters 1 and 2. The emergence of Californian universities in both analyses suggests that the industrial, social, and economic environment itself can affect the third mission activities of these universities.

3.4. Fourth Step

Using the median absolute deviation (MAD) criterion defined in formula (2), the aim of step four is to understand how the universities included in T10 perform compared with the others in T100 for each of the 5 UMR2020 indicators.
Figure 4, Figure 5, Figure 6, Figure 7 and Figure 8 help show the difference between the MAD value of each KT indicator for the 14 top universities in T10 and the outlier threshold: a positive difference is indicated with a red bar and denotes that the corresponding university is classified as an outlier with respect to the distribution of the specific KT indicator of the T100 universities.
In Figure 4, with respect to the “Co-publications with industrial partners” indicator, 7 out of 14 universities in T10 (red bars) perform much better than the remaining ones. Figure 5, which refers to the indicator “Patents awarded (absolute numbers)”, points out that only 3 universities outperform the remaining ones. Finally, Figure 8 shows the result for the “Publications cited in patents” indicator and points out that only 4 universities in T10 outperform. Figures 6 and 7 show no outperforming universities.
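Bar charts of this type are easy to reproduce; the sketch below uses hypothetical difference values (indicator distance from the T100 median minus the 3-scaled-MAD threshold) for an illustrative subset of T10.

```python
import numpy as np
import matplotlib.pyplot as plt

universities = ["MIT", "Harvard", "Berkeley", "Oxford"]  # illustrative subset
diffs = np.array([1.2, 0.4, -0.3, -0.8])                 # hypothetical values

colors = ["red" if d > 0 else "grey" for d in diffs]     # red bar = outlier
plt.bar(universities, diffs, color=colors)
plt.axhline(0.0, color="black", linewidth=0.8)           # the outlier threshold
plt.ylabel("distance from T100 median minus 3-MAD threshold")
plt.show()
```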

4. Discussion

The results obtained are non-trivial and, in some cases, surprising. The clustering analysis in Section 3.3 suggests that the top 14 universities in the world (according to global university rankings) are not an isolated group compared with the others in terms of knowledge transfer and, more generally, third mission activities. With the exception of Stanford University and CALTECH, which form a separate cluster, and of Harvard, which is present in the California cluster, all the other best universities do not “stand out” from the others, considering the available KT indicators.
The MAD-based analysis carried out in Section 3.4 reveals that three indicators appear to be more suitable for measuring the performance of universities in terms of KT activities: “Co-publications with industrial partners”, “Patents awarded (absolute numbers)”, and “Publications cited in patents”. This is a very useful result, since these aspects have been relatively under-investigated so far [66]. The relevance of this result needs to be interpreted in the context of similar findings presented in the scientific literature from the KT perspective.
With respect to “Co-publications with industrial partners”, it can be observed that, in recent years, collaboration between academia and industry, which provides great support to research, development, and innovation in the framework of the third mission [67], has grown more and more [68], taking different forms and returning different outputs that are sometimes difficult to measure [69].
Several indicators can be used to measure the outputs of the university-industry collaboration, such as patents/intellectual property rights, publications, and learning metrics [70]. Among these, the analysis of the co-publications with industrial partners is used as an explicit proxy for evaluating university-industry collaboration [69,71,72,73,74] and for studying the university’s entrepreneurial orientation [75]. The university–industry co-publications, which are jointly authored by university researchers and staff employed by business enterprises, can also help to solve the issue related to the insufficient publicly available information (number of research contracts, number, and type of joint projects with industry, number of licenses) [76].
Some authors have also demonstrated that university–industry co-publications have a significant positive influence on universities’ technology commercialization outputs in terms of patenting, spin-off formation, and technology licensing [77].
In the past, it was often thought that a publication carried out with a company is of less value than a publication carried out with academic partners; however, using a field-weighted citation impact metric around the globe, it has been demonstrated that publications made with a company have a higher citation impact (“The urban myth of less respect for collaborating with industry is busted”) [78].
However, co-publications with industrial partners, as well as other quantitative measures, are still a long way from being regarded as perfect measures of university-firm collaboration [76]. Considering only the publication count is not a reliable tool to estimate the success of university-industry cooperation [79] and not all research collaborations lead to co-publication [80].
Finally, co-publications with industrial partners offer an interesting new source of data for the evaluation of the collaboration between industry and academia, but it should be used for “domestic and international comparisons of research universities, only within non-evaluative multi-dimensional benchmarking frameworks rather than for university league tables” [72].
As for “Patents awarded”, it has always been used as an indicator of invention [81], a valuable estimator of technology development [82], an indicator of KT outputs, and a signal of the capability to exploit and commercialize research results [83]. As Eurostat reports: “a count of patents is one measure of a country’s inventive activity and also shows its capacity to exploit knowledge and translate it into potential economic gains” [84]. Although patents have grown over time and are a tool to create economic profit, there is still a disproportionately small number of real cases of technology transfer [85]. The universities that stand out for this indicator are generally small and private, such as Harvard and MIT, or very specialized and home to numerous Nobel Prize winners, such as Berkeley.
Lastly, concerning the “Publications cited in patents” indicator, some studies stated that such publications play a crucial role in establishing a link between science and technology [86] or, in other words, represent the knowledge flows between codified scientific knowledge (scientific papers) and codified technological knowledge [87]. The indicator “Publications cited in patents”, therefore, is becoming an increasingly used statistical parameter [88] that can be interpreted as a measure of KT efficiency, i.e., of the pipeline from research to exploitation to market.
Excluding “Patents awarded (size-normalized)”, which does not allow a distinction between the 14 top universities and the others in the rankings, the remaining indicator, “Industry co-patents”, probably suggests that patenting in partnership between universities and businesses is not the most useful and efficient form of collaboration. Industry co-patents is quite a controversial indicator. There are several reasons why patenting is not considered an optimal KT indicator: the poor understanding of market needs by academics, their need for publication (publish or perish), very cumbersome academic patenting procedures, unrealistic royalties, and the company’s need to take high risks and make large investments to bring the technology to the market [89]. Li et al. stated that co-ownership has a negative impact on patent commercialization: industry-academia patents are less likely to be commercialized [90]. Cerulli et al. focused their attention on the impact of academic patents on firms’ performance, and they stated that there is a positive impact on market power but a lower profitability [91]. Certainly, academic scientific research is beneficial to industry because it allows firms to enlarge their capability to explore and develop new solutions and technological fields [92]; however, there is still dramatically limited empirical evidence on the impact of academic patents on business performance [91].
Attempting to answer the research question “how do the world’s top universities, evaluated according to global university rankings, perform from a knowledge transfer point of view?”, by analyzing the five knowledge transfer indicators obtained from the UMR2020 dataset and combining the results obtained from the GPI KT calculation, the clustering analysis, and the MAD-based analysis, it is possible to claim that:
-
According to the results in Section 3.3, only 4 universities in T10 are clearly present in the identified clusters: Stanford University, CALTECH, the University of California, Berkeley (UCB), and Harvard University;
-
As emerges from Section 3.4, only 2 universities out of 14 (MIT, Harvard) stand out with respect to the remaining ones when the three indicators “Co-publications with industrial partners”, “Patents awarded (absolute numbers)”, and “Publications cited in patents” are jointly used;
-
By combining the two previous results, it is possible to state that only 5 of the 14 universities in T10 (Berkeley, Stanford, MIT, Harvard, CALTECH) exhibit a high-level performance in KT (they are included in the first 30 positions out of 123) compared with the remaining universities in T100;
-
A joint reading of the results obtained in Section 3 does not return a coherent and clear interpretation. The third mission and the process of knowledge transfer toward the territory can manifest themselves in a variety of forms that are difficult to quantify with simple indicators. The complexity and the many facets of KT are such that they cannot be reduced to a few significant elements, and the quantitative data collected probably need to be integrated with qualitative and contextual data;
-
The most popular global university rankings probably fail to effectively capture third mission activities. They certainly perform better in the case of the other university missions: teaching and research.
In the following, to better understand the success factors and best practices for improving knowledge transfer performance, a brief description of the 5 cited universities is provided, in terms of organization of the technology transfer service, relationships with business, entrepreneurship programs and, more generally, third mission activities.
As a general consideration, they seem able to connect research results with the market and to incentivize businesses and stakeholders to collaborate with them. In other words, as shown in Figure 9, they have developed organizational structures, methods, and approaches to survive the so-called Valley of Death [93], which represents “the gap between where publicly available research funding stops and where private investment or commercial funding starts” [94]. They represent successful cases to be imitated and an inspiring collection of best practices to be adopted.
Harvard University. At Harvard University, the Office of Technology Development (OTD) connects innovators with industry partners, giving support to researchers/innovators in the advancement of their research through corporate partnerships, collaborations, and accelerator programs; the Office helps with the protection of IP to create a clear path forward for commercial development, with business development strategies for licensing or for the creation of new companies. The Office also provides support for industry partners by offering a single point of entry for engaging with Harvard researchers, accelerators, technology licensing, and new ventures. In order to bridge development gaps, Harvard University offers Accelerator programs that combine funding strategies, technical support, and business expertise to help promising innovations make the leap from the lab to the commercial sphere.
Regarding entrepreneurship, OTD’s Entrepreneurs in Residence (EIRs) engage directly with Harvard research groups to help advance technologies toward the launch of a startup.
Finally, the section “Impact” reports the number of new startup companies with their impact on society in terms of education, health care, food and agriculture, energy, sustainability, high-tech goods, and much more.
Massachusetts Institute of Technology. The Massachusetts Institute of Technology (MIT), one of the most vibrant hubs of innovation and entrepreneurship on Earth, dedicates a large part of its website to the topic “Innovation”. MIT’s TLO (Technology Licensing Office) is engaged in cultivating an inclusive environment of scientific and entrepreneurial excellence and bridges connections from MIT’s research community to industry and startups by strategically evaluating, protecting, and licensing technology.
A separate website is dedicated to MIT’s Industrial Liaison Program (ILP) that is “industry’s most comprehensive portal to MIT, enabling companies world-wide to harness MIT resources to address current challenges and to anticipate future needs”. Nowadays more than 800 of the world’s leading companies collaborate with MIT researchers and together bring knowledge to bear on the world’s great challenges.
MIT Corporate Relations, the organizational parent of the ILP at MIT, is dedicated to finding connections to MIT faculty, departments, labs, and centers.
Great emphasis is placed on entrepreneurship with Martin Trust Center for MIT Entrepreneurship that seeks to advance knowledge and educate students in innovation-driven entrepreneurship by providing proven frameworks, courses, programs, facilities, and mentorship, and with the Program “Entrepreneur in Residence (EIR)”, a centerpiece of the Trust Center, where accomplished business leaders advise students on the challenges and benefits of startup life. MIT Startup Exchange is a program of MIT Corporate Relations that actively promotes collaborations and partnerships between MIT-connected startups and industry, principally ILP members.
Stanford University. At Stanford University, the Office of Technology Licensing (OTL) receives invention disclosures from Stanford faculty, staff, and students, evaluates them for their commercial possibilities and, when possible, licenses them to industry. The office supports researchers by providing numerous guides, for instance the “Inventor’s Guide” or the “Researcher’s Guide to Working with Industry”.
Great relevance is given to the concept of IMPACT; in fact, every year, OTL drafts an annual report where the number of issued patents, executed technology licenses, formed startups, and the amount of license income generated are reported.
The office “University Corporate and Foundation Relations” is a central university office that helps to foster relationships between Stanford University, companies, and private professional foundations. For corporations, there are engagement opportunities to collaborate with Stanford University, to connect to and recruit students, and to get executive education.
In the framework of Professional Education, Stanford University has the “Innovation and Entrepreneurship (SI&E) Certificate Program” to learn innovation and entrepreneurship as practiced at Stanford and in the Silicon Valley, and the “Stanford Idea-to-Market (I2M) course” to learn tools, techniques, and real-world expertise to make a business idea a reality.
University of California, Berkeley. The Office of Intellectual Property and Industry Research Alliances (IPIRA) provides a “one-stop shop” for industry research partners to interact with the campus. IPIRA’s mission is to establish and maintain multifaceted relationships with private companies, and thereby enhance the research enterprise of the Berkeley campus. IPIRA has promulgated technology transfer that generated billions of dollars in revenue and has created IP policies to promote social impact. Noticeably, the “Socially Responsible Licensing Program” serves as the gold standard for universities in the public health space.
As for entrepreneurship, UC Berkeley helps students, faculty, researchers, and other innovators access a deep, interconnected ecosystem of resources for educating entrepreneurs, commercializing research, and advancing startups. The office supports entrepreneurs looking for funding, legal services, start-up guidelines, connections with ventures, and so on. The section “Berkeley Startups” provides a partial list of companies born out of licensing UC Berkeley IP rights.
California Institute of Technology (CALTECH). At the California Institute of Technology (CALTECH), the Office of Technology Transfer and Corporate Partnerships (OTTCP) has the mission of driving the transfer of scientific and engineering knowledge created by researchers to “maximize societal impact by developing partnerships with industry through the creation of new ventures, collaborations with corporations, and transfer of IP while nurturing an entrepreneurial environment” (https://innovation.caltech.edu/ accessed on 20 September 2022). The Institute’s homepage showcases the following sections: “Corporate Partnerships”, for productive collaborations and long-term partnerships with industry partners, in order to accelerate progress towards shared goals; “New Venture Creation & Entrepreneurship”, illustrating the support for the formation of startup companies based on Caltech and JPL technologies, maintained through close relationships in the entrepreneurial community; and “Patents & Licensing”, whose goal is to make the technology transfer process and working with industry as easy as possible.
A section dedicated to startups illustrates the “Entrepreneurs in Residence” program, a path to launching a venture, the Caltech funding sources, the entrepreneurship resources, and so on.
The “Impact” section, with the tagline “Pushing interdisciplinary boundaries in the service of discovery”, describes the inventions made by Caltech researchers since its founding. On the same page, the “Impact Report” and the “CALTECH Impact” report the outsized impact on science, technology, and society; their numbers convey the extent of the impact generated by research innovations, commercialization activities, and the overall output of invention disclosures, patents, licenses, and startup companies. OTTCP teams connect companies, industry leaders, and other financial partners to Caltech’s and the Jet Propulsion Laboratory’s research communities and help identify the types of strategic partnerships and opportunities that can advance business and investment goals.

5. Conclusions

This paper proposes a four-step methodology to answer the following research question: “how do the world’s top universities, evaluated according to global university rankings, perform from a knowledge transfer point of view?” and, as a direct effect, to point out the success factors and best practices that can be adopted to improve performance in knowledge transfer.
Starting from the top universities in the most important global university rankings (ARWU, QSWUR, and THEWUR), the paper analyzes how they perform from the perspective of the third mission, knowledge transfer, relationships with business, and entrepreneurship. To this end, a set of specific KT indicators, obtained from U-Multirank 2020, was selected and used to evaluate the performance of the top universities. The comparison carried out shows that the global university rankings are unable to properly evaluate knowledge transfer and third mission activities. In other words, the top universities in the global rankings do not always perform as well from the point of view of the third mission. Only 3 of the top universities in the global rankings come out on top after being evaluated with the specific U-Multirank 2020 knowledge transfer indicators. Among the 30 top universities in KT, only 5 of the top T10 universities are present.
In an attempt to identify possible factors that characterize the best universities from the KT perspective, the paper investigates the sample through a hierarchical clustering algorithm. The results obtained do not show a particular relationship between the clusters obtained and the top universities of the global university rankings. On the other hand, the composition of the clusters is interesting. In some cases, it appears to be based on geographical position (such as the presence of numerous Californian universities in the same cluster), thus suggesting that there are contextual factors that the purely quantitative analyses used by global university rankings fail to grasp or bring out. Finally, the analysis based on the MAD indicator helps to identify the three indicators that best assess the performance of universities in terms of KT activities: Co-publications with industrial partners, Patents awarded (absolute numbers), and Publications cited in patents. The paper also tries to explain, through a targeted bibliographic analysis, why these indicators are interesting and useful for interpreting performance in KT.
The paper proposes a small new step forward in the scientific landscape of the university assessment of technology transfer and third mission, an assessment that is not easy with the means and indicators that currently exist, as reported by other authors [29,66]. In fact, KT indicators are poorly considered in the most popular global university rankings (the QS World University Rankings, the Academic Ranking of World Universities, and The Times Higher Education), except for U-Multirank [32].
Several attempts have been made to develop patterns of KT indicators at the European level [22,94]. Often the difficulty lies in being able to measure only what is measurable, i.e., many activities are not yet measurable and quantifiable such as the KT activities that occur with unintentional mechanisms [24]. For example, learning, contacts, friendship, networks, impact, reputation, and publicity are “non-monetary currencies” [94]. In this regard, there is also the scientific debate about the measurement of the impact and benefits of the KT activities on society (e.g., the UK REF Impact Case Studies).
On the other hand, the challenge of evaluating universities from the third mission point of view is becoming increasingly relevant, since the topic of the impact of the third mission activities and research results on society and the territory is becoming more and more central, even in international policies. In general, governments and politicians are interested in evaluating outcomes of public investment in research, university actors want to demonstrate their contribution to society and different types of stakeholders [22], and industry uses university evaluation in order to assess and determine partnerships.

Author Contributions

Conceptualization: A.D.L., N.A., R.B.; Methodology: A.D.L., F.D.N., N.A., R.B.; Formal analysis: A.D.L., F.D.N.; Investigation: A.D.L., F.D.N.; Visualization: A.D.L., F.D.N., N.A., R.B.; Supervision: N.A., R.B.; Writing—original draft: A.D.L.; Writing—review & editing: A.D.L., A.A., L.B., M.L.R., A.M., E.P., S.T., N.A., R.B. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are openly available at the following webpages: https://www.shanghairanking.com/rankings/arwu/2020 (accessed on 20 September 2022). https://www.topuniversities.com/university-rankings/world-university-rankings/2020 (accessed on 20 September 2022). https://www.timeshighereducation.com/world-university-rankings/2020/world-rank ing#!/page/0/length/25/sort_by/rank/sort_order/asc/cols/stats (accessed on 20 September 2022). https://www.umultirank.org/about/u-multirank/the-project/ (accessed on 20 September 2022).

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Secundo, G.; de Beer, C.; Schutte, C.S.L.; Passiante, G. Mobilising Intellectual Capital to Improve European Universities’ Competitiveness: The Technology Transfer Offices’ Role. J. Intellect. Cap. 2017, 18, 607–624. [Google Scholar] [CrossRef]
  2. Agasisti, T.; Barra, C.; Zotti, R. Research, Knowledge Transfer, and Innovation: The Effect of Italian Universities’ Efficiency on Local Economic Development 2006–2012. J. Reg. Sci. 2019, 59, 819–849. [Google Scholar] [CrossRef] [Green Version]
  3. Research & Innovation Valorisation chanNels and Tools—Publications Office of the EU. Available online: https://op.europa.eu/en/web/eu-law-and-publications/publication-detail/-/publication/f35fded6-bc0b-11ea-811c-01aa75ed71a1 (accessed on 29 October 2022).
  4. World Economic Forum. How Universities Can Become a Platform for Social Change. Available online: https://www.weforum.org/agenda/2019/06/universities-platform-social-change-tokyo/ (accessed on 29 October 2022).
  5. Trippl, M.; Sinozic, T.; Lawton Smith, H. The Role of Universities in Regional Development: Conceptual Models and Policy Institutions in the UK, Sweden and Austria. Eur. Plan. Stud. 2015, 23, 1722–1740. [Google Scholar] [CrossRef] [Green Version]
  6. Cesaroni, F.; Piccaluga, A. The Activities of University Knowledge Transfer Offices: Towards the Third Mission in Italy. J. Technol. Transf. 2016, 41, 753–777. [Google Scholar] [CrossRef]
  7. di Berardino, D.; Corsi, C. A Quality Evaluation Approach to Disclosing Third Mission Activities and Intellectual Capital in Italian Universities. J. Intellect. Cap. 2018, 19, 178–201. [Google Scholar] [CrossRef]
  8. Compagnucci, L.; Spigarelli, F. The Third Mission of the University: A Systematic Literature Review on Potentials and Constraints. Technol. Forecast. Soc. Change 2020, 161, 120284. [Google Scholar] [CrossRef]
  9. Abreu, M.; Demirel, P.; Grinevich, V.; Karataş-Özkan, M. Entrepreneurial Practices in Research-Intensive and Teaching-Led Universities. Small Bus. Econ. 2016, 47, 695–717. [Google Scholar] [CrossRef] [Green Version]
  10. Urdari, C.; Farcas, T.V.; Tiron-Tudor, A. Assessing the Legitimacy of HEIs’ Contributions to Society: The Perspective of International Rankings. Sustain. Account. Manag. Policy J. 2017, 8, 191–215. [Google Scholar] [CrossRef]
  11. Backs, S.; Günther, M.; Stummer, C. Stimulating Academic Patenting in a University Ecosystem: An Agent-Based Simulation Approach. J. Technol. Transf. 2019, 44, 434–461. [Google Scholar] [CrossRef]
  12. Zawdie, G. Knowledge Exchange and the Third Mission of Universities: Introduction: The Triple Helix and the Third Mission—Schumpeter Revisited. Ind. High. Educ. 2010, 24, 151–155. [Google Scholar] [CrossRef]
  13. Lee, J.J.; Vance, H.; Stensaker, B.; Ghosh, S. Global Rankings at a Local Cost? The Strategic Pursuit of Status and the Third Mission. Comp. Educ. 2020, 56, 236–256. [Google Scholar] [CrossRef]
  14. Montesinos, P.; Carot, J.M.; Martinez, J.M.; Mora, F. Third Mission Ranking for World Class Universities: Beyond Teaching and Research. High. Educ. Eur. 2008, 33, 259–271. [Google Scholar] [CrossRef]
  15. Kapetaniou, C.; Lee, S.H. A Framework for Assessing the Performance of Universities: The Case of Cyprus. Technol. Forecast. Soc. Change 2017, 123, 169–180. [Google Scholar] [CrossRef]
  16. Laredo, P. Revisiting the Third Mission of Universities: Toward a Renewed Categorization of University Activities? High. Educ. Policy 2007, 20, 441–456. [Google Scholar] [CrossRef]
  17. Mariani, G.; Carlesi, A.; Scarfò, A.A. Academic Spinoff as a Value Driver of Intellectual Capital: The Case of University of Pisa. J. Intellect. Cap. 2018, 19, 202–226. [Google Scholar] [CrossRef]
  18. Markuerkiaga, L.; Caiazza, R.; Igartua, J.I.; Errasti, N. Factors Fostering Students’ Spin-off Firm Formation: An Empirical Comparative Study of Universities from North and South Europe. J. Manag. Dev. 2021, 35, 814–846. [Google Scholar] [CrossRef]
  19. Giuri, P.; Munari, F.; Scandura, A.; Toschi, L. The Strategic Orientation of Universities in Knowledge Transfer Activities. Technol Forecast Soc. Change 2019, 138, 261–278. [Google Scholar] [CrossRef]
  20. Pausits, A. The Knowledge Society and Diversification of Higher Education: From the Social Contract to the Mission of Universities. In European Higher Education Area; Springer: Cham, Switzerland, 2015; pp. 267–284. [Google Scholar] [CrossRef] [Green Version]
  21. The Future of Higher Education Is Social Impact. Available online: https://ssir.org/articles/entry/the_future_of_higher_education_is_social_impact (accessed on 29 October 2022).
  22. Campbell, A.; Cavalade, C.; Haunold, C.; Karanikic, P.; Piccaluga, A. Knowledge Transfer Metrics—Towards a European-Wide Set of Harmonised Indicators; Publications Office of the European Union: Luxembourg, 2020. [Google Scholar] [CrossRef]
  23. Scanlan, J. A Capability Maturity Framework for Knowledge Transfer. Ind. High. Educ. 2018, 32, 235–244. [Google Scholar] [CrossRef]
  24. Azagra-Caro, J.M.; Barberá-Tomás, D.; Edwards-Schachter, M.; Tur, E.M. Dynamic Interactions between University-Industry Knowledge Transfer Channels: A Case Study of the Most Highly Cited Academic Patent. Res. Policy 2017, 46, 463–474. [Google Scholar] [CrossRef] [Green Version]
  25. Wynn, M.G. University-Industry Technology Transfer in the UK: Emerging Research and Opportunities; IGI Global: Hershey, PA, USA, 2018. [Google Scholar]
  26. Griliches, Z. The Search for R&D Spillovers. Scand. J. Econ. 1992, 94, S29. [Google Scholar] [CrossRef]
  27. Scarrà, D.; Piccaluga, A. The Impact of Technology Transfer and Knowledge Spillover from Big Science: A Literature Review. Technovation 2022, 116, 102165. [Google Scholar] [CrossRef]
  28. O’Reilly, N.M.; Robbins, P.; Scanlan, J. Dynamic Capabilities and the Entrepreneurial University: A Perspective on the Knowledge Transfer Capabilities of Universities. J. Small Bus. Entrep. 2019, 31, 243–263. [Google Scholar] [CrossRef]
29. Olcay, G.A.; Bulu, M. Is Measuring the Knowledge Creation of Universities Possible?: A Review of University Rankings. Technol. Forecast. Soc. Change 2017, 123, 153–160. [Google Scholar] [CrossRef] [Green Version]
  30. Landinez, L.; Kliewe, T.; Diriba, H. Entrepreneurial University Indicators in Global University Rankings. In Developing Engaged and Entrepreneurial Universities: Theories, Concepts and Empirical Findings; Springer: Singapore, 2019; pp. 57–85. [Google Scholar] [CrossRef]
  31. European Commission Press Release. New International University Ranking: Commission Welcomes Launch of U-Multirank. Available online: https://ec.europa.eu/commission/presscorner/detail/en/IP_14_548 (accessed on 29 October 2022).
  32. Dip, J.A. What Does U-Multirank Tell Us about Knowledge Transfer and Research? Scientometrics 2021, 126, 3011–3039. [Google Scholar] [CrossRef]
  33. Bellantuono, L.; Monaco, A.; Tangaro, S.; Amoroso, N.; Aquaro, V.; Bellotti, R. An equity-oriented rethink of global rankings with complex networks mapping development. Sci. Rep. 2020, 10, 18046. [Google Scholar] [CrossRef] [PubMed]
  34. Marhl, M.; Pausits, A. Third Mission Indicators for New Ranking Methodologies. Lifelong Educ. XXI Century 2013, 1, 89–101. [Google Scholar] [CrossRef]
  35. Ringel, L.; Espeland, W.; Sauder, M.; Werron, T. Worlds of Rankings. Res. Sociol. Organ. 2021, 74, 1–23. [Google Scholar] [CrossRef]
36. Dill, D.D.; Soo, M. Academic Quality, League Tables, and Public Policy: A Cross-National Analysis of University Ranking Systems. High. Educ. 2005, 49, 495–533. [Google Scholar] [CrossRef]
37. Bougnol, M.L.; Dulá, J.H. Technical Pitfalls in University Rankings. High. Educ. 2015, 69, 859–866. [Google Scholar] [CrossRef]
  38. Sauder, M.; Espeland, W.N. The Discipline of Rankings: Tight Coupling and Organizational Change. Am. Sociol. Rev. 2009, 74, 63–82. [Google Scholar] [CrossRef]
39. Oțoiu, A.; Țițan, E. To What Extent ICT Resources Influence the Learning Experience? An Inquiry Using U-Multirank Data. INTED2021 Proc. 2021, 1, 40–45. [Google Scholar] [CrossRef]
  40. Johnes, J. University Rankings: What Do They Really Show? Scientometrics 2018, 115, 585–606. [Google Scholar] [CrossRef] [Green Version]
  41. Frondizi, R.; Fantauzzi, C.; Colasanti, N.; Fiorani, G. The Evaluation of Universities’ Third Mission and Intellectual Capital: Theoretical Analysis and Application to Italy. Sustainability 2019, 11, 3455. [Google Scholar] [CrossRef] [Green Version]
42. Hazelkorn, E.; Loukkola, T.; Zhang, T. Rankings in Institutional Strategies and Processes: Impact or Illusion?; European University Association: Brussels, Belgium, 2014. [Google Scholar]
  43. Bellantuono, L.; Monaco, A.; Amoroso, N.; Aquaro, V.; Bardoscia, M.; Loiotile, A.D.; Lombardi, A.; Tangaro, S.; Bellotti, R. Territorial Bias in University Rankings: A Complex Network Approach. Sci. Rep. 2022, 12, 4995. [Google Scholar] [CrossRef] [PubMed]
44. Degli Esposti, M.; Vidotto, G. Il Gruppo di Lavoro CRUI Sui Ranking Internazionali: Attività, Risultati e Prospettive 2017–2020; Fondazione CRUI: Rome, Italy, 2020; ISBN 978-88-96524-32-9. [Google Scholar]
  45. Salomaa, M.; Cinar, R.; Charles, D. Rankings and Regional Development: The Cause or the Symptom of Universities’ Insufficient Regional Contributions? High. Educ. Gov. Policy 2021, 2, 31–44. [Google Scholar]
  46. Rauhvargers, A. Global University Rankings and Their Impact—Report II; European University Association: Brussels, Belgium, 2013; pp. 21–23. [Google Scholar]
  47. Loukkola, T. Europe: Impact and Influence of Rankings in Higher Education. In Global Rankings and the Geopolitics of Higher Education; Routledge: London, UK, 2016; pp. 127–139. [Google Scholar] [CrossRef]
  48. Pusser, B.; Marginson, S. University Rankings in Critical Perspective. J. High. Educ. 2013, 84, 544–568. [Google Scholar] [CrossRef]
  49. Aguillo, I.F.; Bar-Ilan, J.; Levene, M.; Ortega, J.L. Comparing University Rankings. Scientometrics 2010, 85, 243–256. [Google Scholar] [CrossRef]
  50. Moed, H.F. A Critical Comparative Analysis of Five World University Rankings. Scientometrics 2017, 110, 967–990. [Google Scholar] [CrossRef] [Green Version]
  51. Academic Ranking of World Universities (ARWU). Available online: https://www.shanghairanking.com/rankings/arwu/2020 (accessed on 20 September 2021).
  52. QS World University Rankings® (QSWUR). Available online: https://www.topuniversities.com/university-rankings/world-university-rankings/2020 (accessed on 20 September 2021).
  53. Times Higher Education World University Rankings (THEWUR). Available online: https://www.timeshighereducation.com/world-university-rankings/2020/worldranking#!/page/0/length/25/sort_by/rank/sort_order/asc/cols/stats (accessed on 20 September 2021).
  54. Van Vught, F.; Ziegele, F. Design and Testing the Feasibility of a Multidimensional Global University Ranking Final Report; Consortium for Higher Education and Research Performance Assessment: Twente, The Netherlands, 2011. [Google Scholar]
55. Prado, A. Performances of the Brazilian Universities in the “U-Multirank” in the Period 2017–2020; SciELO Preprints: São Paulo, Brazil, 2021. [Google Scholar] [CrossRef]
  56. Decuypere, M.; Landri, P. Governing by Visual Shapes: University Rankings, Digital Education Platforms and Cosmologies of Higher Education. Crit. Stud. Educ. 2020, 62, 17–33. [Google Scholar] [CrossRef]
  57. Kaiser, F.; Zeeman, N. U-Multirank: Data Analytics and Scientometrics. In Research Analytics; Auerbach Publications: Boca Raton, FL, USA, 2017; pp. 185–220. [Google Scholar] [CrossRef]
  58. Westerheijden, D.F.; Federkeil, G. U-Multirank: A European multidimensional transparency tool in higher education. Int. High. Educ. 2018, 4, 77–96. [Google Scholar]
  59. U-multirank. Available online: https://www.umultirank.org/about/u-multirank/the-project/ (accessed on 20 September 2022).
60. Roux, M. A Comparative Study of Divisive Hierarchical Clustering Algorithms. arXiv 2015, arXiv:1506.08977. [Google Scholar] [CrossRef] [Green Version]
  61. Ahmed, M.; Seraj, R.; Islam, S.M.S. The K-Means Algorithm: A Comprehensive Survey and Performance Evaluation. Electronics 2020, 9, 1295. [Google Scholar] [CrossRef]
  62. Wiley. Robust Statistics, 2nd Edition. Available online: https://www.wiley.com/en-us/Robust+Statistics%2C+2nd+Edition-p-9780470129906 (accessed on 29 October 2022).
  63. Leys, C.; Ley, C.; Klein, O.; Bernard, P.; Licata, L. Detecting Outliers: Do Not Use Standard Deviation around the Mean, Use Absolute Deviation around the Median. J. Exp. Soc. Psychol. 2013, 49, 764–766. [Google Scholar] [CrossRef] [Green Version]
  64. Simmons, J.P.; Nelson, L.D.; Simonsohn, U. False-Positive Psychology: Undisclosed Flexibility in Data Collection and Analysis Allows Presenting Anything as Significant. Psychol. Sci. 2011, 22, 1359–1366. [Google Scholar] [CrossRef] [Green Version]
  65. Van de Wiel, M.A.; Bucchianico, A.D. Fast Computation of the Exact Null Distribution of Spearman’s ρ and Page’s L Statistic for Samples with and without Ties. J. Stat. Plan. Inference 2001, 92, 133–145. [Google Scholar] [CrossRef]
  66. Rossi, F.; Rosli, A. Indicators of University–Industry Knowledge Transfer Performance and Their Implications for Universities: Evidence from the United Kingdom. Stud. High. Educ. 2014, 40, 1970–1991. [Google Scholar] [CrossRef]
  67. Piirainen, K.A.; Andersen, A.D.; Andersen, P.D. Foresight and the Third Mission of Universities: The Case for Innovation System Foresight. Foresight 2016, 18, 24–40. [Google Scholar] [CrossRef]
  68. Djoundourian, S.; Shahin, W. Academia–Business Cooperation: A Strategic Plan for an Innovative Executive Education Program. Ind. High. Educ. 2022, 42, 09504222221083852. [Google Scholar] [CrossRef]
  69. Kohus, Z.; Baracskai, Z.; Czako, K. The Relationship between University-Industry Co-Publication Outputs. In Proceedings of the 20th International Scientific Conference on Economic and Social Development. Varazdin Development and Entrepreneurship Agency, Budapest, Hungary, 4–5 September 2020; pp. 109–122. [Google Scholar]
  70. Perkmann, M.; Neely, A.; Walsh, K. How Should Firms Evaluate Success in University–Industry Alliances? A Performance Measurement System. R&D Manag. 2011, 41, 202–216. [Google Scholar] [CrossRef]
  71. Tijssen, R. Joint Research Publications: A Performance Indicator of University-Industry Collaboration. Assess. Eval. High. Educ. 2011, 5, 19–40. [Google Scholar]
  72. Tijssen, R.J.W.; van Leeuwen, T.N.; van Wijk, E. Benchmarking University-Industry Research Cooperation Worldwide: Performance Measurements and Indicators Based on Co-Authorship Data for the World’s Largest Universities. Res. Eval. 2009, 18, 13–24. [Google Scholar] [CrossRef]
  73. Giunta, A.; Pericoli, F.M.; Pierucci, E. University–Industry Collaboration in the Biopharmaceuticals: The Italian Case. J. Technol. Transf. 2016, 41, 818–840. [Google Scholar] [CrossRef]
  74. Levy, R.; Roux, P.; Wolff, S. An Analysis of Science-Industry Collaborative Patterns in a Large European University. J. Technol. Transf. 2009, 34, 1–23. [Google Scholar] [CrossRef]
  75. Tijssen, R.J.W. Universities and Industrially Relevant Science: Towards Measurement Models and Indicators of Entrepreneurial Orientation. Res. Policy 2006, 35, 1569–1585. [Google Scholar] [CrossRef]
76. Yegros-Yegros, A.; Azagra-Caro, J.M.; López-Ferrer, M.; Tijssen, R.J.W. Do University–Industry Co-Publication Outputs Correspond with University Funding from Firms? Res. Eval. 2016, 25, 136–150. [Google Scholar] [CrossRef]
  77. Wong, P.K.; Singh, A. Do Co-Publications with Industry Lead to Higher Levels of University Technology Commercialization Activity? Scientometrics 2013, 97, 245–265. [Google Scholar] [CrossRef]
  78. University-Industry Collaboration: A Closer Look for Research Leaders. Available online: https://www.elsevier.com/research-intelligence/university-industry-collaboration (accessed on 29 October 2022).
  79. Seppo, M.; Lilles, A. Indicators Measuring University-Industry Cooperation. Est. Discuss. Econ. Policy 2012, 20, 204. [Google Scholar] [CrossRef]
  80. Katz, J.S.; Martin, B.R. What Is Research Collaboration? Res. Policy 1997, 26, 1–18. [Google Scholar] [CrossRef]
81. Metrics for Knowledge Transfer from Public Research Organisations in Europe. Report from the European Commission’s Expert Group on Knowledge Transfer Metrics. Available online: https://ec.europa.eu/info/research-and-innovation_en (accessed on 20 September 2022).
82. Huang, Z.; Chen, H.; Yip, A.; Ng, G.; Guo, F.; Chen, Z.-K.; Roco, M.C. Longitudinal Patent Analysis for Nanoscale Science and Engineering: Country, Institution and Technology Field. J. Nanopart. Res. 2003, 5, 333–363. [Google Scholar] [CrossRef]
83. Finne, H.; Day, A.; Piccaluga, A.; Spithoven, A.; Walter, P.; Wellen, D. A Composite Indicator for Knowledge Transfer. Report from the European Commission’s Expert Group on Knowledge Transfer Indicators. Available online: https://op.europa.eu/en/publication-detail/-/publication/d3260d80-5e59-11ed-92ed-01aa75ed71a1 (accessed on 5 October 2022).
  84. Archive: Patent Statistics—Statistics Explained. Available online: https://ec.europa.eu/eurostat/statistics-explained/index.php?title=Archive:Patent_statistics&oldid=112826 (accessed on 29 October 2022).
  85. Choi, J.; Jang, D.; Jun, S.; Park, S. A Predictive Model of Technology Transfer Using Patent Analysis. Sustainability 2015, 7, 16175–16195. [Google Scholar] [CrossRef] [Green Version]
  86. Hammarfelt, B. Linking Science to Technology: The “Patent Paper Citation” and the Rise of Patentometrics in the 1980s. J. Doc. 2021, 77, 1413–1429. [Google Scholar] [CrossRef]
  87. Yamashita, Y. Exploring Characteristics of Patent-Paper Citations and Development of New Indicators; IntechOpen: London, UK, 2018. [Google Scholar] [CrossRef] [Green Version]
  88. OECD iLibrary. OECD Science, Technology and Industry Scoreboard 2015: Innovation for Growth and Society. Available online: https://www.oecd-ilibrary.org/science-and-technology/oecd-science-technology-and-industry-scoreboard-2015_sti_scoreboard-2015-en (accessed on 29 October 2022).
  89. Hamano, Y. University–Industry Collaboration; WIPO: Geneva, Switzerland, 2018. [Google Scholar]
90. Li, L.; Chen, Q.; Jia, X.; Herrera-Viedma, E. Co-Patents’ Commercialization: Evidence from China. Econ. Research-Ekon. Istraživanja 2020, 34, 1709–1726. [Google Scholar] [CrossRef]
  91. Cerulli, G.; Marin, G.; Pierucci, E.; Potì, B. Do Company-Owned Academic Patents Influence Firm Performance? Evidence from the Italian Industry. J. Technol. Transf. 2022, 47, 242–269. [Google Scholar] [CrossRef]
  92. Peeters, H.; Callaert, J.; van Looy, B. Do Firms Profit from Involving Academics When Developing Technology? J. Technol. Transf. 2020, 45, 494–521. [Google Scholar] [CrossRef]
  93. Hudson, J.; Khazragui, H.F. Into the Valley of Death: Research to Innovation. Drug Discov. Today 2013, 18, 610–613. [Google Scholar] [CrossRef]
  94. Hockaday, T. University Technology Transfer: What It Is and How to Do It; Johns Hopkins University Press: Baltimore, MD, USA, 2020; p. 340. [Google Scholar]
Figure 1. Flow chart.
Figure 2. Radar plots of the five UMR knowledge transfer indicators for (a) the 14 universities in T10 and (b) the top performers in T100 according to the GPI KT.
Figure 3. Dendrogram resulting from the hierarchical clustering on T100. Each cluster is represented by a different color.
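As a companion to Figure 3, the following minimal sketch shows how a dendrogram of this kind can be produced with agglomerative hierarchical clustering in Python. The Ward linkage, the Euclidean metric, and the five-university toy input (values from Table 2) are illustrative assumptions, not the paper's exact configuration.

```python
# Illustrative sketch: hierarchical clustering of universities described
# by the five UMR knowledge transfer indicators. Ward linkage on
# Euclidean distances is an assumption, not necessarily the setting
# used in the paper.
import numpy as np
import matplotlib.pyplot as plt
from scipy.cluster.hierarchy import linkage, dendrogram, fcluster

# Toy input: rows = universities, columns = the five indicators (in %),
# values copied from Table 2.
names = ["UC Berkeley", "Harvard", "MIT", "Stanford", "CALTECH"]
X = np.array([
    [7.10, 98.94, 2.00, 10.02, 2.30],
    [8.00, 66.12, 1.45, 8.26, 3.30],
    [10.50, 57.72, 4.04, 5.86, 4.90],
    [9.40, 40.63, 1.68, 11.13, 2.80],
    [7.60, 29.42, 10.88, 7.33, 2.10],
])

Z = linkage(X, method="ward")                     # build the hierarchy
labels = fcluster(Z, t=3, criterion="maxclust")   # cut into 3 clusters
print(dict(zip(names, labels)))

dendrogram(Z, labels=names)  # colors sub-trees below a default distance threshold
plt.ylabel("Ward distance")
plt.tight_layout()
plt.show()
```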
Figure 4. Difference between the value of the indicator “Co-publications with industrial partners” for the 14 top universities in T10 and the outlier threshold.
Figure 5. Difference between the value of the indicator “Patents awarded (absolute numbers)” for the 14 top universities in T10 and the outlier threshold.
Figure 6. Difference between the value of the indicator “Patents awarded (size normalized)” for the 14 top universities in T10 and the outlier threshold.
Figure 7. Difference between the value of the indicator “Industry co-patents” for the 14 top universities in T10 and the outlier threshold.
Figure 8. Difference between the value of the indicator “Publications cited in patents” for the 14 top universities in T10 and the outlier threshold.
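Figures 4–8 report each indicator as a difference from an outlier threshold. The paper's exact threshold definition is not restated here; a common robust convention, assumed in the sketch below, is the median plus a multiple of the scaled median absolute deviation (MAD), which is far less sensitive to extreme values than mean-and-standard-deviation cutoffs.

```python
# Hedged sketch: a robust upper outlier threshold based on the median
# absolute deviation (MAD). The cutoff median + 3 * 1.4826 * MAD is a
# common convention; the threshold actually used in the paper may differ.
import numpy as np

def mad_threshold(values, k=3.0):
    """Upper outlier threshold: median + k * scaled MAD."""
    values = np.asarray(values, dtype=float)
    med = np.median(values)
    # 1.4826 makes the MAD consistent with the standard deviation
    # for normally distributed data.
    mad = 1.4826 * np.median(np.abs(values - med))
    return med + k * mad

# Example: "Co-publications with industrial partners" values (in %)
# for the 14 T10 universities, copied from Table 2.
values = [7.1, 8.0, 10.5, 9.4, 7.6, 8.7, 8.2, 6.8, 7.8, 7.4, 6.7, 7.1, 10.2, 7.3]
threshold = mad_threshold(values)

# Positive differences mark outliers, mirroring the bars in Figures 4-8.
for v in values:
    print(f"{v:5.1f}  diff = {v - threshold:+.2f}  {'outlier' if v > threshold else ''}")
```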
Figure 9. The steps of a bridge across the Valley of Death: from research towards the market (blue steps) and from the market towards research (yellow steps).
Table 1. The University Ranking: World Top Ten Universities.

| Rank | ARWU University | Country | QSWUR University | Country | THEWUR University | Country |
|---|---|---|---|---|---|---|
| 1 | Harvard University | US | Massachusetts Institute of Technology (MIT) | US | University of Oxford | UK |
| 2 | Stanford University | US | Stanford University | US | California Institute of Technology (CALTECH) | US |
| 3 | University of Cambridge | UK | Harvard University | US | University of Cambridge | UK |
| 4 | Massachusetts Institute of Technology (MIT) | US | University of Oxford | UK | Stanford University | US |
| 5 | University of California, Berkeley | US | California Institute of Technology (CALTECH) | US | Massachusetts Institute of Technology (MIT) | US |
| 6 | Princeton University | US | ETH Zurich—Swiss Federal Institute of Technology | CH | Princeton University | US |
| 7 | Columbia University | US | University of Cambridge | UK | Harvard University | US |
| 8 | California Institute of Technology (CALTECH) | US | University College London (UCL) | UK | Yale University | US |
| 9 | University of Oxford | UK | Imperial College London | UK | University of Chicago | US |
| 10 | University of Chicago | US | University of Chicago | US | Imperial College London | UK |
Table 2. Summary of UMR2020 KT indicators for T10 universities.

| University | Co-publications with industrial partners | Patents awarded (absolute number) | Patents awarded (size-normalized) | Industry co-patents | Publications cited in patents | GPI KT |
|---|---|---|---|---|---|---|
| University of California, Berkeley (UCB) | 7.10% | 98.94% | 2.00% | 10.02% | 2.30% | 24.07% |
| Harvard University | 8.00% | 66.12% | 1.45% | 8.26% | 3.30% | 17.43% |
| Massachusetts Institute of Technology (MIT) | 10.50% | 57.72% | 4.04% | 5.86% | 4.90% | 16.60% |
| Stanford University | 9.40% | 40.63% | 1.68% | 11.13% | 2.80% | 13.13% |
| California Institute of Technology (CALTECH) | 7.60% | 29.42% | 10.88% | 7.33% | 2.10% | 11.47% |
| ETH Zurich—Swiss Federal Institute of Technology | 8.70% | 6.94% | 0.29% | 38.91% | 2.00% | 11.37% |
| University of Cambridge | 8.20% | 4.86% | 0.22% | 25.59% | 1.70% | 8.11% |
| University of Chicago | 6.80% | 4.89% | 0.24% | 20.76% | 2.00% | 6.94% |
| Columbia University | 7.80% | 19.97% | 0.54% | 4.23% | 1.70% | 6.85% |
| University College London (UCL) | 7.40% | 8.30% | 0.18% | 14.88% | 1.50% | 6.45% |
| Yale University | 6.70% | 8.20% | 0.51% | 11.97% | 2.10% | 5.90% |
| University of Oxford | 7.10% | 6.38% | 0.22% | 12.75% | 1.80% | 5.65% |
| Imperial College London | 10.20% | 4.17% | 0.20% | 10.95% | 1.90% | 5.48% |
| Princeton University | 7.30% | 5.88% | 0.61% | 11.11% | 1.20% | 5.22% |
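A useful consistency check on Table 2: every GPI KT value coincides with the arithmetic mean of the five indicator percentages in its row (e.g., UCB: (7.10 + 98.94 + 2.00 + 10.02 + 2.30)/5 = 24.07). The sketch below reproduces the last column under this assumption, which is inferred from the tabulated values rather than quoted as the authors' formal definition.

```python
# Minimal sketch: GPI KT as the unweighted arithmetic mean of the five
# UMR knowledge transfer indicators -- an assumption that is consistent
# with every row of Table 2, not a definition quoted from the paper.
t10 = {
    "UC Berkeley": [7.10, 98.94, 2.00, 10.02, 2.30],
    "Harvard":     [8.00, 66.12, 1.45, 8.26, 3.30],
    "MIT":         [10.50, 57.72, 4.04, 5.86, 4.90],
}

for university, indicators in t10.items():
    gpi_kt = sum(indicators) / len(indicators)
    print(f"{university}: GPI KT = {gpi_kt:.2f}%")

# Output matches Table 2:
# UC Berkeley: GPI KT = 24.07%
# Harvard: GPI KT = 17.43%
# MIT: GPI KT = 16.60%
```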
Table 3. Performance of universities in T10 according to the GPI KT.

| University | Position in T10 | Position in T100 according to GPI KT |
|---|---|---|
| University of California, Berkeley (UCB) | 1 | 6 |
| Harvard University | 2 | 14 |
| Massachusetts Institute of Technology (MIT) | 3 | 15 |
| Stanford University | 4 | 26 |
| California Institute of Technology (CALTECH) | 5 | 30 |
| ETH Zurich—Swiss Federal Institute of Technology | 6 | 31 |
| University of Cambridge | 7 | 45 |
| University of Chicago | 8 | 62 |
| Columbia University | 9 | 65 |
| University College London (UCL) | 10 | 71 |
| Yale University | 11 | 79 |
| University of Oxford | 12 | 84 |
| Imperial College London | 13 | 88 |
| Princeton University | 14 | 93 |
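Table 3 places the same 14 universities in two orderings, which invites a rank-correlation check. The short sketch below applies `scipy.stats.spearmanr` to the two columns; because the listed T100 positions increase monotonically with the T10 positions, ρ equals 1 for these 14 institutions, even though their absolute positions within T100 are often far from the top.

```python
# Sketch: Spearman rank correlation between the two columns of Table 3.
from scipy.stats import spearmanr

position_in_t10 = list(range(1, 15))  # 1..14, the order of Table 3
position_in_t100 = [6, 14, 15, 26, 30, 31, 45, 62, 65, 71, 79, 84, 88, 93]

rho, p_value = spearmanr(position_in_t10, position_in_t100)
print(f"Spearman rho = {rho:.2f}")  # 1.00: the relative order is preserved
```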
Table 4. Best-performing universities in T100 according to the GPI KT.

| University | Co-publications with industrial partners | Patents awarded (absolute number) | Patents awarded (size-normalized) | Industry co-patents | Publications cited in patents | GPI KT |
|---|---|---|---|---|---|---|
| Tsinghua University | 5.60% | 42.18% | 29.09% | 82.28% | 1.20% | 32.07% |
| University of California, San Diego (UCSD) | 10.60% | 98.98% | 2.34% | 10.02% | 2.70% | 24.93% |
| University of California, Santa Barbara | 8.20% | 98.94% | 3.30% | 10.02% | 2.30% | 24.55% |
| University of California, Los Angeles (UCLA) | 8.00% | 100.00% | 1.85% | 10.15% | 2.50% | 24.50% |
| Boston University | 7.90% | 4.46% | 100.00% | 6.85% | 1.80% | 24.20% |
| University of California, Berkeley (UCB) | 7.10% | 98.94% | 2.00% | 10.02% | 2.30% | 24.07% |
| University of California, Davis | 6.70% | 98.94% | 2.19% | 10.02% | 1.70% | 23.91% |
| The University of Tokyo | 8.30% | 19.11% | 19.36% | 70.63% | 1.90% | 23.86% |
| Seoul National University | 8.10% | 28.13% | 28.27% | 46.52% | 1.80% | 22.56% |
| Weizmann Institute of Science | 4.90% | 6.74% | 89.68% | 5.12% | 3.50% | 21.99% |
| Kyoto University | 8.80% | 12.93% | 17.01% | 57.71% | 1.80% | 19.65% |
| Tokyo Institute of Technology (Tokyo Tech) | 11.10% | 7.24% | 3.62% | 72.17% | 1.70% | 19.17% |
| Tohoku University | 10.20% | 13.98% | 0.69% | 68.89% | 1.60% | 19.07% |
| Harvard University | 8.00% | 66.12% | 1.45% | 8.26% | 3.30% | 17.43% |
Table 5. The clusters obtained on T100.

| Cluster | Universities in the cluster |
|---|---|
| 1 | University of California, Santa Barbara; University of California, Los Angeles (UCLA); University of California, Berkeley (UCB); University of California, Davis; University of California, San Diego (UCSD); Harvard University; KAIST—Korea Advanced Institute of Science & Technology |
| 2 | KU Leuven; University of Toronto; Boston University; Weizmann Institute of Science; The University of Queensland; Université Grenoble Alpes; The University of New South Wales (UNSW Sydney); The University of Melbourne; University of Copenhagen; Aarhus University; The Hong Kong University of Science and Technology; Sungkyunkwan University (SKKU); The University of Tokyo; Korea University; Kyoto University; Seoul National University; Tsinghua University |
| 3 | Stanford University; California Institute of Technology (CALTECH) |
| 4 | Universities in the rest of the world |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
