Article

Supporting Sustainable and User-Oriented Educational Technology Innovation with the University Innovation Canvas

by Mia Bangerl 1,2,*, Sebastian Dennerlein 3, Katharina Maitz 4, Marie Nitschke 1, Martin Ebner 1 and Viktoria Pammer-Schindler 1,2

1 Institute of Interactive Systems and Data Science, Graz University of Technology, 8010 Graz, Austria
2 Know-Center GmbH, 8010 Graz, Austria
3 Faculty of Behavioural, Management and Social Sciences, University of Twente, 7522 NJ Enschede, The Netherlands
4 Department of Educational Sciences, Private University College of Teacher Education Augustinum, 8010 Graz, Austria
* Author to whom correspondence should be addressed.
Educ. Sci. 2024, 14(5), 528; https://doi.org/10.3390/educsci14050528
Submission received: 15 April 2024 / Revised: 2 May 2024 / Accepted: 3 May 2024 / Published: 13 May 2024
(This article belongs to the Special Issue Digital Learning Innovation)

Abstract

Innovating higher education teaching and learning is challenging for structural, cultural, and resource-related reasons, and research indicates that university innovation benefits from a bottom-up approach as well as from strategic alignment with university objectives. In this paper, we investigate such bottom-up innovation processes within higher education as supported by a specific tool: the University Innovation Canvas (UIC). Adapted from the Business Model Canvas and the Lean Canvas, the UIC is designed to promote educational technology innovation and to align the innovation process with two strategic objectives of the university: sustainability and user orientation. An evaluation of the UIC based on interview and questionnaire data shows that its usage differs between innovation teams (paper-based vs. digital, individual vs. collaborative, co-located vs. remote, and synchronous vs. asynchronous). UIC usability is linked with these differences and with teams’ experience in realizing innovations. Overall, the UIC is perceived as useful by (particularly less-experienced) innovation teams and is successful at supporting sustainable and user-oriented innovations: 14 of 15 innovations are still in use up to four years after completion. To maximize its potential, more effort needs to be devoted to improving understanding of the UIC and to supporting the different workflows of innovation teams in the future.

1. Introduction

Universities are critical drivers of innovation in society. To create knowledge and develop innovative science and technology, universities need to engage and invest in innovative solutions. However, there are substantial barriers to such innovation within higher education, such as a general lack of appreciation and motivation for university teaching [1,2] and difficulty in realizing an aligned strategy for innovation in decentralized higher education institutions [3,4].
In this paper, we investigate bottom-up innovation within a university, supported by a specific tool: the University Innovation Canvas (UIC). The UIC is designed to focus innovation on widely agreed-upon success factors, such as considering users, resources necessary for innovation, and sustainability. Based on evaluation data from five years of usage of the UIC, we examine how the tool has been used and perceived by university innovation project teams and to what degree it was perceived as useful from a strategic alignment perspective.
In the following sections, we review related work on innovation in universities and the literature on business-oriented bottom-up innovation and innovation tools. We introduce our specific use case and the UIC and then describe our research questions and methods. Finally, we discuss our results with respect to the usage and usability of the UIC and reflect upon its role in guiding educational technology innovation in higher education.

2. Related Work

2.1. Innovation in Higher Education

When it comes to creating and developing innovation in society, universities fulfill a crucial role. Etzkowitz and Leydesdorff [5] used the term “triple helix” to describe a symbiotic relationship between universities, industry, and government in innovation building and development. According to their model, universities are supported by both (inter)national governments and industry to develop new theoretical and applied knowledge and to train highly educated future industry employees. The knowledge generated by universities is then taken up by industry, on the one hand, to develop profitable economic products, and by governments, on the other hand, to inform legislative decisions (e.g., economic regulation). Additionally, industry profits generate wealth, which is positive from a governmental stance and, in turn, generates new funds for university financing and innovation development [5,6].
However, in order to be able to contribute and connect to society in this manner, universities need to also consider innovation regarding their own processes [7,8,9]. This is clearly visible in terms of computing technology, which enables as well as forces changes in teaching [7] and research [8]—the two key value-adding processes of universities—and, of course, also administration [9].
In this article, we are mostly concerned with innovation in terms of educational technology within higher education and, to a lesser extent, with technology for research data management.
For innovation endeavors in the context of teaching and learning in higher education, Schneckenberg [4] describes several barriers that arise from structural, cultural, and resource-related issues.
Firstly, universities are bottom-heavy institutions; thus, the successful realization of new forms of (e.g., technology-enhanced) learning typically relies on individual actors [4]. Due to this structural characteristic and the major distinctions between academic disciplines and between university faculties and institutes [3,10], universally mandated educational changes would likely be implemented inconsistently and ineffectively. Secondly, Schneckenberg [4] points to the lack of motivation of many university lecturers to change their teaching. Although many lecturers would like to modernize their teaching and incorporate elements of digital learning [11,12,13], there is a lack of recognition and valuation of excellent university teaching. Universities typically treat teaching as secondary to research when recruiting and promoting academic staff, which leaves lecturers with little incentive to put extra effort into their teaching [1,2,14,15]. This lack of incentive is also reflected in a lack of funding opportunities and of other necessary resources (e.g., didactic and technical expertise and costs for materials and electronic devices; see [12,16,17,18]). Nonetheless, Schneckenberg [4] also notes a clear desire in universities to prioritize teaching more strongly in the future. As explicated above, universities rely on attracting good students to remain competitive and to maintain their critical role in the socioeconomic environment [4,6]. Thus, modernizing and innovating their teaching is not just desirable but essential.
In summary, bringing forward innovation within universities is difficult and requires dedicated effort by the universities. Transforming the system of university teaching and learning (i.e., also impacting educational culture, appreciation of teaching, etc.) [19] requires both an investment of resources (funding, expertise, time, and appreciation) by the university and a strategic alignment of the innovation process. Two key strategic objectives for university innovations are sustainability and user orientation. Innovations should be sustainable, i.e., they should be used or make an impact for a substantial period of time [20,21]. Such sustainability can be supported, e.g., by integrating the innovation into the existing sociotechnical infrastructure, by designing for easy usability, and by relying on open-source technology. Further, by designing user-oriented innovation, i.e., designing (educational technology) innovations based on users’ (staff and students) problems and needs [22], innovations are more likely to be accepted by users and can integrate more seamlessly into the real environment of use [23,24].
To develop innovations that fulfill the strategic objectives of sustainability and user orientation, a unified and methodical approach is required. However, the field of higher education lacks specific methods or tools that can guide and support the innovation process. For this reason, we look towards bottom-up innovation in a business context, where such methods and tools have been created and applied successfully.

2.2. Bottom-Up Innovation

A frequent practice that companies apply in innovation processes is the inclusion of company-internal and external stakeholders, or users, in the innovation process [25,26]. Developing products with users in a co-creative fashion can be beneficial to both sides. Companies gain knowledge, ideas, and feedback from stakeholders who bring a detailed understanding of the problem, the context of use, and the appropriate solution for said problem in the form of an innovation idea or plan. User innovation research has also shown innovations developed by or in collaboration with users to be very successful [25,27], not least because such innovations are developed based on users’ specific needs and thus tend to be more sustainable. Users (company-internal and external) bring both intrinsic and extrinsic motivation for developing, realizing, and using the innovation [28]. The innovation usually promises a solution to a specific problem experienced by the users, which creates a strong incentive to participate in the innovation process. External users also benefit from the resources and expertise (e.g., economic and legal) provided by an established company, which allows larger-scale innovation development and applicability of the innovation beyond a sometimes very small and highly specialized context of use [26].
This bottom-up, user-oriented approach towards innovation therefore relies on relevant knowledge or skills accessible to the organization, either from external users or from decentralized actors within the organization, as is the case for universities. However, bottom-up innovation developments also rely on some level of strategic alignment with the organization’s goals and strategies [29,30]. For example, a study by Hsiao [29] indicates that a lack of strategic alignment can result in overly diversified products with compromised product performance, and that a balance between bottom-up decentralized innovation and strategic alignment with central goals (such as sustainability and user orientation) yields the best product success.
In our work, we build on two established business model innovation tools to ensure this balance: the Business Model Canvas by Osterwalder and Pigneur [31] and the Lean Canvas by Maurya [32].
Both the Business Model Canvas and the Lean Canvas are frequently used to find and create a business model and to structure and guide the innovation process [33,34,35]. The Business Model Canvas is a well-established visual tool for designing, structuring, implementing, and assessing (new) business models [31]. By visually representing the key components of a business model, it serves as a structuring tool, but it can also be used as a device for communication [36,37], collaboration [38], and creative innovation [39]. The Business Model Canvas is separated into nine visual tiles (e.g., Customer Relationships, Revenue Streams, etc.) that together provide a generic overview of how a business aims to create, deliver, and realize (often, monetize) value [40]. The Lean Canvas [32] is a distilled (lean) adaptation of the Business Model Canvas that is particularly focused on startups and early-stage innovation plans, which is reflected in its adapted key components, which are, e.g., strategically more focused on identifying a customer problem and a corresponding market gap [32,41]. Both the Business Model Canvas and the Lean Canvas are established innovation tools and can be used to strategically guide, structure, and develop bottom-up innovation. However, their design is focused on a business context of usage, and they cannot be used seamlessly to guide innovation in higher education due to distinct differences: most notably in the financing models of a company vs. a university, but also in differing goals, e.g., capturing markets vs. integrating into existing infrastructures. Adaptations of such tools can therefore be developed for non-commercial innovation, such as innovation in higher education, and have been used successfully in such contexts before (see, e.g., [42]).

2.3. Synthesis

Based on the reviewed literature, it is clear that universities need and want to innovate their teaching and learning. However, innovation in higher education can be challenging due to a multitude of structural, cultural, and resource-related reasons. The decentralized and highly diverse structure of most European universities calls for a bottom-up approach towards innovation—but lasting transformations also require a level of alignment with university goals and strategies. In this context, we focus on two strategic objectives: namely, the development of sustainable (lasting) and user-oriented innovations. While in a business context, there are ample tools, methods, and sources on how to guide innovation in alignment with such goals, the same cannot be claimed for higher education and educational technology. In this paper, we address this research gap by introducing the University Innovation Canvas (UIC) as a strategic tool for bottom-up innovation in higher education and reflect upon its value, usability, and usage based on five years of evaluation in a university innovation initiative.

3. University Innovation Canvas: Context of Use, Design Rationale, and Design

Our research was set in our university, Graz University of Technology, in the context of a university innovation initiative for digitalization that ran from 2019 to 2023. The emphasis of the research presented here is on innovation activities regarding teaching (technology-enhanced teaching and learning), with some results also stemming from parallel and similarly structured innovation activities regarding research (research data management).
In the context of this initiative, university lecturers and researchers could apply for funded, small-scale digital innovation projects in so-called innovation project teams. An applicant team needed to consist of, at minimum, two persons (including at least one lecturer in the case of the educational technology projects). The innovation initiative was organized in three cycles, i.e., there were three application periods in total.
Each innovation project had to address one or multiple directly observed problems from the respective environments of applicants. The proposed customized digital solution could be developed (including an implementation and evaluation phase) over two to three semesters. To encourage and ensure each innovation’s contribution to the university’s strategic goals (sustainability and user orientation), the solution had to be implemented in at least one university course, with feedback collection from learners and/or lecturers as an obligatory stipulation for teaching-related (educational technology) innovation. For research-related (research data management) innovations, the solution had to be implemented within a research team or an institute—again, with feedback collection as an obligatory stipulation.
Further, the innovation initiative also organized overarching community-building events, such as Barcamps [43], during which project teams had the opportunity to connect, build interdisciplinary networks, and present and disseminate their work within our university.
The goal of our research, which is presented in this paper, was to develop tools that support the above-described bottom-up innovation process and to evaluate their usage, usability, and usefulness. Such tools should serve the innovation project teams during the development of innovative technology for teaching and learning or for research by (i) supporting documentation of core aspects of the innovation idea as well as reflection, (ii) supporting the university’s strategic goal of facilitating sustainable innovation beyond the seed-funding provided within the innovation program, and (iii) guiding user-oriented development as a means to achieve such sustainability [44].
For educational technology innovation, after reviewing the Business Model Canvas [31] and the Lean Canvas [32], it was decided that substantial adaptations would help achieve these goals, and the University Innovation Canvas (UIC) was created.
As displayed in Figure 1, the UIC is structured into three sections and eleven tiles, which correspond to different aspects of innovation design and development and to the strategic goals of sustainability and user orientation. The first section, in blue and inspired by the Lean Canvas, encompasses four tiles: Addressed Problem(s), Innovation Actions, Value Proposition, and Value Measure. These tiles are designed to give an overview of the motivation, the idea, and the realization steps of the planned innovation. By using the UIC, teams plan out the steps for resolving the observed problem and think about the desired outcome and how to measure success. The second section, in orange, encompasses two tiles that go beyond both the Business Model Canvas and the Lean Canvas: Co-Creation Plans and Stakeholder Groups. This section focuses on user-oriented innovation and development. Using the UIC, project teams think about all relevant stakeholders of their innovation and how to include their opinions and feedback in the innovation and implementation process. The third section, in green, encompasses five tiles and is inspired by and adapted from the Business Model Canvas: Partner Institutions, Key Resources, Sustainability Plans and Ideas, Dissemination Channels, and Learning and Chances. The first two tiles are designed to foster strategic planning about relevant project partners and resources, while the latter three are concerned with planning for the long-term preservation and usage of the innovation, spreading awareness and broadening potential user groups, and thinking toward possible future developments. The final tile, Learning and Chances, is original to the UIC and was created specifically to account for the iterative use of the tool throughout a project’s runtime.
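To summarize the canvas layout compactly, the following minimal sketch (illustrative only and not part of the published UIC materials; the section labels are informal shorthand for the colors in Figure 1) represents the three sections and eleven tiles described above as a simple Python data structure:

# Illustrative sketch: the three UIC sections and their eleven tiles,
# as described above. Section keys are shorthand labels, not official names.
UIC_TILES = {
    "blue (inspired by the Lean Canvas)": [
        "Addressed Problem(s)",
        "Innovation Actions",
        "Value Proposition",
        "Value Measure",
    ],
    "orange (user orientation, original to the UIC)": [
        "Co-Creation Plans",
        "Stakeholder Groups",
    ],
    "green (adapted from the Business Model Canvas)": [
        "Partner Institutions",
        "Key Resources",
        "Sustainability Plans and Ideas",
        "Dissemination Channels",
        "Learning and Chances",
    ],
}

# Sanity check: three sections and eleven tiles in total.
assert len(UIC_TILES) == 3
assert sum(len(tiles) for tiles in UIC_TILES.values()) == 11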
The UIC was given to the project teams in two versions: a concise version with graphical tiles and a longer, written document version of the UIC with headings and subsections. The graphical UIC (Figure 1) was designed to provide a quick and precise overview of the project and its current status, and the textual UIC was designed to serve as a more in-depth project description and report, where teams could elaborate. The UIC has been published by the innovation initiative at Graz University of Technology under a CC BY-SA 4.0 international license and is therefore freely available to other interested parties, e.g., other higher education institutions, for adoption and adaptation.
For research-related innovation (research data management, specifically), it was decided to use the Business Model Canvas, as it seemed to be applicable and understandable in informal first discussions with prospective innovation team members.

4. Research Questions

Accordingly, the present work focuses on an evaluation of the University Innovation Canvas (UIC) as a tool that specifically supports teaching-related digital innovation in higher education.
We ask the following research questions in this paper:
RQ1 (UIC Use): How was the UIC used throughout the five years and by different project teams?
RQ2 (UIC Usability): Was the UIC perceived as a usable tool by project teams?
RQ3 (UIC Perceived Usefulness): Was the UIC perceived as useful by the project teams, particularly from the perspective of strategic alignment with the university’s goals (sustainability and user orientation)?
Note that, naturally, project teams had no direct insight into the alignment of their project with the university’s strategies and goals; thus, the answer to RQ3 also includes general outcomes of the initiative.

5. Materials and Methods

5.1. Procedure—Innovation Projects and UIC Evaluation

The evaluation of the University Innovation Canvas (UIC) ran partly in parallel with the innovation projects and was partly post-hoc. We, therefore, first describe how the innovation process was roughly organized and then clarify at which points evaluation activities regarding the UIC took place.
After successfully applying for funding for its innovation project, each innovation team was introduced to the UIC in a kick-off meeting. In this kick-off meeting, it was explained how the UIC can be used and filled out to support the innovation process. Each tile of the UIC was discussed and explained in detail, and any remaining questions were answered.
Within its project runtime, each innovation team participated in three (mandatory) meetings regarding the UIC to reflect on the current understanding of the innovation project in terms of the aspects covered by the UIC. The first meeting was always scheduled at the start of the project runtime. The second meeting was usually scheduled before the innovation was first implemented in a university course. The third and final meeting was scheduled around the end time of the project. These meetings were moderated by members of the UIC research team (experts from the innovation initiative), i.e., always by one of this paper’s authors. The UIC always had to be submitted in both graphical and textual form to these innovation experts before the meeting.
Within the meetings, the teams usually first presented their progress and brought up open questions, problems, successes, etc. Then, the project was discussed with regard to the UIC. The moderator also gave feedback on comprehensibility, completeness, and strategic orientation as visible in the UIC. Particular attention was also dedicated to potential contradictions, challenges, and future potential. The meetings were recorded, and a questionnaire on the UIC and the UIC meeting was filled out by the attending members of the project team after each meeting. Teams were also able to contact the innovation experts outside of the regularly scheduled UIC meetings if they had questions or needed support. This offer was taken up by several teams, most notably regarding the integration of users into the innovation process.
An overview of the typical innovation project procedure is given in Figure 2.

5.2. Data Collection

We answered our research questions by analyzing two types of collected data: a structured evaluation questionnaire and semi-structured qualitative interviews. Both instruments contained items regarding all research questions.
The “UIC Evaluation Questionnaire” was to be filled in by all participating members of each innovation team after each of the three mandatory meetings for discussing the innovation with respect to the University Innovation Canvas. The questionnaire contained questions about the team members’ experiences with using the UIC and participating in the UIC meetings. The questionnaire also asked about the perceived increase in innovation skills and knowledge. In the final cycle of the innovation initiative, an “Extended UIC Evaluation Questionnaire” was used to get more in-depth insight into the project teams’ uses of the UIC. The extended questionnaire included the same questions as before plus additional questions about the perceived usefulness of the individual tiles of the UIC and the perceived value of different steps of working with the UIC (e.g., filling it out as a team and discussing it with the innovation experts).
In the interviews, questions were again asked regarding the use and perception of the UIC and how it contributed to the outcome of the respective innovation project.

5.3. Sample and Participants

Overall, 15 innovation project teams were active between 2019 and 2023 in the area of educational technology innovation. These 15 teams consisted of 2–7 team members (m = 3.60, md = 3, sd = 1.45, n = 15) and typically included one or multiple experienced senior researchers as project owners and one or multiple researchers in the post-doc or Ph.D. phase as project managers. Student project employees were also frequently included in the project team, particularly to support the technical realization of projects. The 15 project teams covered a large spectrum of academic disciplines: all seven university departments (Architecture; Civil Engineering Sciences; Electrical and Information Engineering; Computer Science and Biomedical Engineering; Mechanical Engineering and Economic Sciences; Mathematics, Physics and Geodesy; Technical Chemistry, Chemical Process Engineering, and Biotechnology), and 20 different labs were represented among the teams. In total, 45 individuals participated in project teams, with 6 individuals working in multiple teams. The gender distribution among the 45 project team members was 75.56% male and 24.44% female, which corresponds almost exactly to the gender distribution among the university’s academic staff [45].
Of the 15 educational technology innovation project teams, 14 participated in the mandatory UIC meetings and filled in an evaluation questionnaire after each of these meetings. One team dropped out of the innovation program due to internal problems.
As there were three UIC meetings for each innovation project, the questionnaire was filled out three times by the team members of the 14 teams who participated in the meetings. An overview of all evaluation data is displayed in Table 1. Overall, the UIC evaluation questionnaire was, therefore, filled out 67 times and by 30 different project members. After the first UIC meetings, 25 answers were received, then 20 answers were received after the second UIC meetings, and 22 were received after the third UIC meetings. In the third innovation cycle, a more in-depth version of the questionnaire was filled out by the teams: in total 24 times and by 11 different project members. For this extended questionnaire, 10 answers were received after the first meeting, 6 were received after the second meeting, and 8 were received after the third meeting.
Interviews on UIC usage and perception were conducted with five people, each a member of a different innovation project in the area of educational technology innovation. In addition, we also interviewed three members of innovation teams working on research (data management)-related projects, who had used the original Business Model Canvas as an innovation guidance tool instead of the UIC. We carried out these interviews to gain broader insight into canvases for bottom-up innovation in higher education and to better connect our results regarding the UIC to the broad literature on canvases in business innovation.
All interviews took place after the respective innovation project was concluded, and all interviewees participated voluntarily. Only team members who had a central role (e.g., project manager) throughout their innovation project’s runtime were invited, as they had used the respective tool most intensely. We only invited one team member per project and tried to recruit interviewees from different innovation cycles of the initiative to form a diverse pool of interviewees and obtain rich qualitative data. All interviews but one were held in German, as this was the native language of the interviewees. One non-native German interviewee preferred to be interviewed in English.

5.4. Data Analysis

The results from the questionnaire were statistically analyzed and compared regarding distributions and central tendencies of the data (mean, median, and standard deviation). As the questionnaires (both the original and the extended version) were filled out by project team members up to three times—once after each UIC meeting in which they participated—we analyzed the answers by grouping the data by the respective meetings. Thus, we formed three data groups for the original questionnaire, with the first including all answers received after the first UIC meetings (n_M1 = 25), the second including all answers received after the second UIC meetings (n_M2 = 20), and the third including all answers received after the third UIC meetings (n_M3 = 22). This gave us three disjoint groups of relatively equal size. We repeated this method with the data of the extended questionnaire from the third innovation cycle, which was filled out 24 times in total (n_M1 = 10, n_M2 = 6, and n_M3 = 8). (We refrained from using inferential statistical methods (e.g., testing for group differences) due to the inconsistent quality of the data, which would have drastically compromised the validity of any such results. In our three data groups, some project team members were represented in all three groups and some in only one or two groups; the groups had different sizes, and project teams had different numbers of team members; thus, teams were not equally represented in the data. Problems such as these often arise in field studies [46] and can limit the validity of results. In this study, we tried to validate our results by using a mixed-methods approach and prioritizing descriptive analyses.)
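As an illustration of this grouping step, the following minimal sketch (with a hypothetical file name and item column; not part of the study materials) shows how questionnaire answers could be grouped by UIC meeting and summarized descriptively in Python:

import pandas as pd

# Hypothetical input: one row per submitted questionnaire, with a "meeting"
# column (1, 2, or 3) and one column per Likert item (values 1-5).
answers = pd.read_csv("uic_questionnaire_answers.csv")

# Group the answers by the UIC meeting after which they were submitted and
# compute mean, median, standard deviation, and group size per meeting.
summary = (
    answers.groupby("meeting")["spot_shortcomings"]  # hypothetical item column
    .agg(["mean", "median", "std", "count"])
)
print(summary)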
The interviews were transcribed and qualitatively analyzed in MAXQDA following Kuckartz’s [47] content-structuring qualitative content analysis. Through iterative inductive and deductive coding, we developed a coding system encompassing eight analytical categories, with which all interview transcripts were coded in a final iteration. Based on the coded data, we summarized and interpreted the results for this paper.

6. Results

6.1. Usage of the UIC

In this section, we analyze the results concerning the first research question (RQ1): How was the UIC used throughout the five years and by different project teams? We answer this research question with results from our eight qualitative interviews with the educational technology and research data management project team members. Results from the interviews show that usage of the University Innovation Canvas (UIC) varied greatly between project teams. Interviewees reported many different methods of filling out the UIC and working with it throughout their project period. From the analysis, we derived four aspects of how the UIC was used, each characterized by two opposing modes of usage. Different methods of working with the UIC corresponded to different benefits and problems, which are discussed in the next section (Section 6.2).
The first aspect of usage concerns the use of the UIC in a paper-based, physical form or as a digital tool. Some teams printed the UIC on a large paper poster and used post-its to fill the different tiles with content. One team also reported that they printed the UIC, but only in regular A4 format, and later transferred it to a digital version. Many teams also used the UIC in digital form, filling it in via PowerPoint or using online collaboration tools such as Mural or Miro.
We printed it out on A0 or A1 paper, put it on a whiteboard, and then filled it with post-its together. That was the most valuable thing for us, to think about it together for an hour, just like a brainstorming session, to be forced to live through the whole thing, even with pauses and more and more additions. We then photographed it. That was actually the essential thing for us—the graphic work!
(Interview 5, educational technology project member)
The second aspect concerns the usage of the UIC individually or as a team. Most teams chose to fill in the UIC jointly as a group, either by dividing the task among members or by working on it together. However, a small number of project teams, particularly those with few team members, transferred the responsibility of filling in the UIC to one person only.
The third aspect concerns the collaborative usage of the UIC in physical co-presence or in remote collaboration. This aspect is strongly connected with the first aspect, as teams who used a paper-based UIC always collaborated in physical co-presence, while teams who used a digital UIC collaborated either remotely or in physical co-presence. Collaborating remotely was often relevant for larger, more interdisciplinary teams whose members did not share an office wing and had trouble coordinating regular meetings on-site.
The fourth aspect concerns the collaborative usage of the UIC in synchrony or in asynchronous iterations. While the other aspects stayed mostly consistent throughout the project runtime, the form of collaboration—if a team worked collaboratively—often varied over time. Several teams prioritized having one large, synchronous meeting when they first filled in the UIC, but then they only updated the UIC in iterations and whenever necessary (e.g., before the UIC meetings). Interviewees from such teams reported that they invested a lot of time and joint planning effort into filling in the UIC for the first time, so they then did not need many updates throughout the rest of the project runtime and only updated the UIC asynchronously. Similarly, many teams filled in the graphical UIC together in synchrony but then elaborated their project in the textual UIC in asynchrony.
P: [M]ostly we filled it out in the core team, so [colleague], myself, and our first study assistant at the time, […], and we filled it in online together.
I: So you did it more or less synchronized?
P: More or less. It was more me during the updates.
I: Yes, so synchronized at the beginning, and then in iterations?
P: Yes.
(Interview 3, educational technology project member)
Overall, there was a lot of variation in teams’ methods of working with the UIC along these dimensions. In contrast, teams were very similar regarding the contents of their UICs, in the sense that all teams accepted the tile-based structure of the UIC and were able to make sense of the corresponding dimensions of university innovation development. All teams filled in context-appropriate and project-relevant content and stayed concise in the graphical UIC while elaborating in greater detail in the textual UIC. In this sense, while the method of working with the UIC varied, the outcome of this work was still relevant, appropriate, and productive for the project development.

6.2. Usability of the UIC

In this section, we analyze the results concerning the second research question (RQ2): Was the UIC perceived as a usable tool by project teams? We answer this research question with results from our qualitative interviews and the UIC evaluation questionnaires. Results from our interviews show that the UIC was initially hard to understand for many project teams, particularly those with less combined knowledge of project management and innovation development. Interviewees reported confusion about the meanings of certain tiles (e.g., sustainability) when first learning about the UIC. In addition, they often had not thought about aspects such as sustainability or dissemination at the beginning of their project. However, for these same reasons, the perceived usefulness of the UIC as a structuring and alignment tool was also highest for project teams without much prior experience in project management and in realizing innovations (see Section 6.3).
I: And when the UIC was first introduced, what did you think about it? Was it too much, or unclear how to use it, or…?
P: Yes, completely unclear. I didn’t have anything to do with project management before and not much now. Because of this, I was sceptical about it. Now, I find it quite good to structure these points, even if I don’t use it yet [in other projects].
(Interview 1, educational technology project member)
In contrast, project team members with a lot of experience were able to quickly understand and work with the UIC without excessive effort. Interviewees from such project teams had typically already thought through their project in great detail and could easily transfer that knowledge and planning to the UIC. In this sense, the UIC was perceived more as a reporting tool and was perceived to be less useful because the work of thinking about and planning key aspects of the project had already taken place beforehand.
In the extended final questionnaire, educational technology team members were asked about their experience in realizing innovation projects on a scale from 1 (least) to 5 (most). Answers show that the mean experience was relatively low after the first UIC meeting (m_M1 = 2.00, md_M1 = 2, sd_M1 = 1.25, n_M1 = 10) but rose throughout the project (m_M2 = 2.20, md_M2 = 2, sd_M2 = 1.30, n_M2 = 6; m_M3 = 2.88, md_M3 = 3, sd_M3 = 1.55, n_M3 = 8). Similarly, agreement with the statement “My knowledge and skills in bringing forward sustainable innovation increased as a result of the UIC meeting.” also increased after each meeting, as displayed in Figure 3.
These results are consistent with what our interviewees reported: most project teams had little experience in innovation project planning and management, and thus the UIC was initially difficult to understand for most teams.
This struggle to properly understand the specific tiles and the method of using the canvas was amplified for the research data management teams, who used the Business Model Canvas. These interviewees initially found the Business Model Canvas confusing, as it contained fields (e.g., cost structure) that presupposed a commercial, profit-oriented use of the innovation. Though they all eventually learned to use the tool and fill in all its tiles, this required intense and sometimes repeated discussions with the innovation experts, and even after the conclusion of their projects, they all found the Business Model Canvas unfitting for projects in a university context.
The first [filling in] was a challenge, definitely, and the biggest part of the challenge was that it was difficult to translate what was, let me put it this way, a university pilot project into a business plan, a business case, and there were things in this Business Model Canvas that I simply didn’t need in the project. Things that were also not necessary.
(Interview 4, research data management project member)
Finally, the multitude of different methods of working with the UIC (see Section 6.1) also led to related usability issues: teams who collaborated remotely would have liked a digitized UIC with collaborative functions (e.g., comments and track changes), teams who printed the UIC in A4 format needed more space for writing, and teams who printed the UIC in a larger format (e.g., A0) asked for a higher-resolution version. In the extended questionnaire, teams were asked an open question regarding what would have improved working with the UIC. Less frequently mentioned improvements included a “better design” of the UIC, more preparation by the project team, and more focus on the project goals. The grouped results are displayed in Figure 4 and correspond to the different workflows presented in Section 6.1.

6.3. Perceived Usefulness of the UIC

In this section, we analyze the results concerning the third research question (RQ3): Was the UIC perceived as useful by the project teams, particularly from the perspective of strategic alignment with the university’s goals (sustainability and user orientation)? We answer this research question with results from our qualitative interviews and the UIC evaluation questionnaires. As presented in the previous section (Section 6.2), the UIC was generally perceived as useful by the educational technology project teams: in particular, by teams with less experience and fewer competencies in (innovation) project planning and management. The results from both questionnaires suggest that the perceived usefulness of the UIC increased over time:
In the main questionnaire, teams were asked to express their agreement, on a scale from 1 (lowest) to 5 (highest) points, with the statement “Explicating the educational technology innovation in the UIC has helped me to spot shortcomings or gaps in the innovation process.” The mean agreement increased after each UIC meeting, from m_M1 = 3.80 (md_M1 = 4, sd_M1 = 0.88, n_M1 = 25) after the first meeting to m_M2 = 3.85 (md_M2 = 4, sd_M2 = 0.92, n_M2 = 20) after the second meeting and m_M3 = 4.14 (md_M3 = 4, sd_M3 = 0.81, n_M3 = 22) after the third meeting. Similarly, mean agreement (on the same scale) with the statement “My knowledge and skills in bringing forward sustainable innovation increased as a result of the UIC meeting.” also increased consistently, from m_M1 = 3.64 (md_M1 = 4, sd_M1 = 1.03, n_M1 = 25) after the first meeting to m_M2 = 3.85 (md_M2 = 4, sd_M2 = 0.70, n_M2 = 20) after the second meeting and m_M3 = 4.00 (md_M3 = 4, sd_M3 = 1.14, n_M3 = 22) after the third UIC meeting.
In the extended questionnaire, we also asked after each UIC meeting which steps of working with the UIC the team members had found helpful in the innovation process. The three steps perceived to be most helpful are displayed in Figure 5. As shown in the graph, team members found it most helpful to outline the project aspects in the graphical UIC, to discuss the UIC within the project team, and to discuss the UIC in the meetings with the innovation experts. While the latter two steps were ranked highly from the beginning, the filling-out of the graphical UIC was found to be helpful by only 30.0% (after the first UIC meeting) and 33.0% (after the second UIC meeting) of educational technology team members. However, by the third meeting, this percentage had increased to 75.0%.
For each tile of the UIC, we also asked team members whether discussing this tile in the UIC meeting was helpful. After the first meeting, Innovation Actions was rated most helpful (helpful for 80.0% of team members, n_M1 = 10); after the second meeting, Addressed Problem(s) and Sustainability Plans and Ideas were rated most helpful (helpful for 50% of team members, n_M2 = 6); and after the third meeting, Value Measure and Sustainability Plans and Ideas were rated most helpful (helpful for 100% of team members, n_M3 = 8). Generally, all tiles were found to be helpful by some team members at some point. (With one exception: the tile Partner Institutions was never found to be helpful by anyone after any meeting. Though project partners were discussed in the UIC meetings, they were often included in discussions regarding either Stakeholder Groups or Key Resources, which might explain this result.) The perceived helpfulness of the tiles increased over time, and after the third meeting, the perceived helpfulness of all tiles (except one) was higher than in any prior meeting. This again underlines the increased perceived usefulness of the UIC over time. This increase was also reflected in the interviews. Interviewees often reported initial skepticism regarding the UIC but explained that, over time, they came to understand it better and found it increasingly useful, both for its structuring and guiding functions and for inspiring discussion and reflection about aspects of innovation development that the teams would not have thought about on their own. Several interviewees even stated that they would like to use the UIC again for other projects in the future.
[T]he tiles helped me to find a common thread again because I wasn’t working on the project all the time, I was doing lots of other things. And when I had it in front of me, when I opened it up and saw it, the information we wrote on it, the way we filled it in, it helped me to get back into [the project] relatively quickly.
(Interview 6, educational technology project member)
The most positive thing was that you think about certain things a little more carefully than you might otherwise do. You define requirements, addressed problems are clear anyway, yes, but about the other things, about sustainability aspects, benefits for partners and stakeholders, and so on. So you might otherwise think less about that, including the sustainable aspects, how do I manage to keep it going after the end of the project at the [university].
(Interview 3, educational technology project member)
Likewise, in the extended questionnaire, we also asked project team members what they found positive about the UIC in an open question. Figure 6 displays the grouped responses to this question. Most commonly, the UIC was found to provide a good overview and inspire reflection and understanding about different aspects of realizing innovations. Additionally, the team members valued the UIC for providing structure for process planning and realization and for its function as a communication and team alignment tool. Other replies commended the specific and precise tiles of the UIC and noted that work with the UIC was productive.
Altogether, the results relevant to RQ3 clearly suggest that the UIC was found to be useful in the innovation process and that this perception increased throughout the innovation projects’ runtimes. The results for RQ3 are also supported by the outcome of the initiative: after completion of all 15 educational technology innovation projects, with the first projects formally completed in 2020 and the final projects completed in 2023, 14 out of 15 innovations are still in use in one or multiple courses at the university.

7. Discussion

We emphasize that this discussion is set in the context of a university innovation initiative that provided university staff who wanted to innovate their teaching or research data management with resources (financial, structural, time, and expertise). The results presented on the usage, usability, and usefulness of the University Innovation Canvas (UIC) must therefore be interpreted within the context of this initiative and not as isolated effects of the UIC alone.
Additionally, we note that many of the project team members had already worked on their innovation ideas on their own before the initiative, and the initiative in which the UIC was integrated supported them in realizing or further developing their plans. This means that the innovation initiative also served as a source of appreciation and motivation for the innovation project teams, and that the teams typically had a very good domain understanding of what they wanted to achieve with their innovation projects. We can therefore regard the innovation project teams as teams of domain experts with a good understanding of both the problems they wanted to address and the solutions they proposed.
In the section below, we discuss how the UIC was used and reflect upon its role and function in the innovation process.
Within the innovation projects, the UIC was used in iterations, and differences between innovation project teams were reflected in different modes of using the UIC: in print (paper-based) or digital form, individually (filled in by one project team member only) or collaboratively, in co-located (face-to-face) or remote collaboration (e.g., via videoconferencing), and in synchronous or asynchronous collaboration. All teams found a functioning solution for filling in the UIC and were able to relate their project and its contents to the individual tiles. The initial familiarization with the tool was challenging for several teams; however, as teams used the UIC repeatedly, their understanding of the UIC tiles and the perceived value of the individual tiles (and the corresponding bottom-up innovation aspects) for the innovation process improved, as indicated by the increasing perceived relevance of individual tiles.
Our results also show that the UIC was perceived as particularly helpful for project teams with less combined experience in project management and innovation development—which underlines the function of the UIC as a structural alignment tool. However, the evaluation results suggest that the UIC could be even more effective if the initial effort of understanding the tool could be reduced, e.g., by a more extensive explanation of the tiles and respective examples or by expert support in the first filling-in session. Further, it would increase the usability of the UIC if the tool were available in multiple versions (e.g., digital, printed, and with comment functions) and thus could adapt to the different methods of usage observed in the evaluation.
Overall, the UIC was perceived as a useful tool for supporting the innovation process—on the one hand, by providing structure and guiding development and, on the other hand, by serving as a base for communication and reflection within the project team. In this sense, the UIC was able to replicate many of the central functions attributed to the original Business Model Canvas [36,37,38,39]. Regarding its structure and tiles, the UIC was perceived to be well-adapted to the university context in which the innovations were realized and more suitable for university innovation projects than the Business Model Canvas. The UIC also led teams to plan and think about strategically relevant aspects of innovation development in higher education (sustainability and user orientation) that they would otherwise not have considered or would have considered less important. These outcomes support, post-hoc, our decision not to use the Business Model Canvas directly in educational technology innovation projects but, rather, to develop a domain-specific canvas (the UIC). Further, they suggest that even for research data management innovation projects, which had initially been expected to be relatively similar to typical (software) product innovation projects, a domain-specific adaptation may be desirable.
The general outcomes of the innovation initiative also underline the positive results of the evaluation: after five years of realizing and completing innovation projects, 14 out of the 15 developed educational technology innovations are still in use at the university. Many innovations were adapted or developed further: several innovation project teams found more funding after the end of their university-funded project, and some innovations were even developed into start-ups that create value within and beyond the university.
For these reasons, we argue that the UIC is a suitable guidance and alignment tool for a bottom-up university innovation approach in education technology, but we also emphasize that its implementation needs to be accompanied by additional commitment from the university in the form of funding, expertise, general support, and appreciation for motivated innovators [4,12,14].
Naturally, the evaluation of the UIC presented in this paper is subject to certain limitations. Firstly, our sample sizes were generally small and uneven, which is often the case for field studies [46]. Note that the initiative in which the UIC was integrated only funded a small number of educational technology projects (4–6) per innovation cycle; therefore, our sample size, although small, is nevertheless a meaningful representation of the total population of the project teams and project team members. Secondly, with the UIC, we only focus on one aspect of the innovation process, though our innovation initiative included other aspects, such as events, interaction with stakeholders, etc. A holistic evaluation of the initiative concerning all of its aspects would most likely yield more detailed results; however, such an evaluation is beyond the scope of this paper. Thirdly, our results only concern the usage, usability, and usefulness of the UIC in the context of one university’s initiative, and we cannot predict how much value this tool would provide for another university with a different sociotechnical and sociocultural context. A broader implementation and evaluation of the UIC or similar innovation guidance tools at other universities would, therefore, be interesting for future research.

8. Conclusions

In this investigation, we evaluated the University Innovation Canvas (UIC), a tool for guiding and structuring university innovation projects in the area of educational technology. We evaluated the UIC regarding its usage, usability, and usefulness in the context of strategic alignment with university goals for educational technology innovation development: namely, sustainability and user orientation. Our results indicate that the usage of the UIC by project teams varied and that there were four key differences in usage: (i) usage of the UIC as a printed, paper-based tool vs. usage of the UIC as a digital tool (e.g., in Miro), (ii) collaborative usage of the tool in the project team (i.e., filling in and discussing the UIC contents together) vs. individual usage (i.e., only one team member fills in the UIC), (iii) collaboration in physical co-presence (i.e., face-to-face meetings) vs. collaboration online (e.g., via videoconferencing), and (iv) collaboration in synchrony (i.e., joint, simultaneous work) vs. collaboration in asynchrony (i.e., dividing the work of filling in the canvas and iterative revisions between team members). Further, the usability of the UIC is linked with these differences in usage (e.g., difficulty of adapting the UIC to a suitable collaborative online environment) as well as with a team’s experience in innovation development and project management, as more experienced teams learned how to use the UIC faster (e.g., better understanding of the individual tiles) and were able to work with it more efficiently. The UIC was found to be well-adapted to the university context (compared to the Business Model Canvas) and was successful at fostering alignment with university goals: The UIC was perceived as very useful by project teams—particularly those with less experience in realizing innovation projects. Further, 14 out of 15 innovations are still in use up to four years after their implementation in university courses.
Taken together, these results illustrate that the UIC can be a supportive tool for guiding innovation development and fostering strategic alignment. However, the effective use of the UIC can be compromised by a lack of experience in implementing innovation; therefore, in the future, more effort needs to be devoted to supporting teams’ initial understanding of the UIC and to establishing workflows that suit different project teams.

Author Contributions

Conceptualization, V.P.-S., M.B. and S.D.; methodology, M.B.; validation, M.B. and M.N.; formal analysis, M.B. and M.N.; investigation, M.B., K.M. and S.D.; resources, S.D., K.M. and M.B.; data curation, M.B.; writing—original draft preparation, M.B. and V.P.-S.; writing—review and editing, S.D., K.M., M.N., M.E. and V.P.-S.; visualization, M.B.; supervision, V.P.-S.; project administration, S.D., K.M. and M.B.; funding acquisition, V.P.-S. and M.E. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported via intramural research funding by Graz University of Technology as part of the Digital TU Graz Marketplace innovation initiative. The publication of this article is supported by the TU Graz Open Access Publishing Fund.

Institutional Review Board Statement

The study did not require ethical approval.

Informed Consent Statement

Written informed consent to participate in the evaluation study and use of collected personal data for analysis and publication was obtained from all subjects involved in the study. This also includes the permission to publish this paper.

Data Availability Statement

The datasets presented in this article are not openly available due to privacy restrictions and to protect the anonymity of the subjects participating in this study and their personal data. Requests regarding the datasets should be directed to [email protected].

Acknowledgments

We would like to thank all those who contributed to the initiative "Digital TU Graz Marketplace", most notably Hermann Schranzhofer, Markus Koschutnig-Ebner, and Walther Nagler. Open access funding was provided by Graz University of Technology.

Conflicts of Interest

The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

Abbreviations

The following abbreviations are used in this manuscript:
UIC  University Innovation Canvas

Figure 1. University Innovation Canvas.
Figure 2. Innovation initiative procedure.
Figure 3. Perceived knowledge increase in teams as a result of the UIC meetings.
Figure 4. What would have improved working with the UIC?
Figure 5. Which steps were helpful in the innovation process?
Figure 6. What did you like about working with the UIC?
Table 1. Evaluation data overview.

Evaluation Data | After UIC Meeting 1 | After UIC Meeting 2 | After UIC Meeting 3 | Total Answers | Total Participants
UIC Evaluation Questionnaire | 25 | 20 | 22 | 67 | 30
Extended UIC Evaluation Quest. | 10 | 6 | 8 | 24 | 11
Interviews Educational Tech. | – | – | – | – | 5
Interviews Research Data Mgt. | – | – | – | – | 3