Article

Handrails through the Swamp? A Pilot to Test the Integration and Implementation Science Framework in Complex Real-World Research

1 Manaaki Whenua-Landcare Research, Lincoln 7608, New Zealand
2 Independent Researcher, Hamilton 3200, New Zealand
3 The New Zealand Institute of Plant and Food Research, Lincoln 7608, New Zealand
* Author to whom correspondence should be addressed.
Sustainability 2021, 13(10), 5491; https://doi.org/10.3390/su13105491
Submission received: 16 March 2021 / Revised: 5 May 2021 / Accepted: 6 May 2021 / Published: 14 May 2021

Abstract

The socio-environmental challenges the world faces are ‘swamps’: situations that are messy, complex, and uncertain. The aim of this paper is to help disciplinary scientists navigate these swamps. To achieve this, the paper evaluates an integrative framework designed for researching complex real-world problems, the Integration and Implementation Science (i2S) framework. As a pilot study, we examine seven inter- and transdisciplinary agri-environmental case studies against the concepts presented in the i2S framework, and we hypothesise that considering concepts in the i2S framework during the planning and delivery of agri-environmental research will increase the usefulness of the research for next users. We found that for the types of complex, real-world research done in the case studies, increasing attention to the i2S dimensions correlated with increased usefulness for the end users. We conclude that using the i2S framework could provide handrails for researchers, helping them navigate the swamps when engaging with the complexity of socio-environmental problems.

1. Introduction

The grand socio-environmental challenges of our time are typically messy, ambiguous, unstable, complex, and context driven [1,2,3]. Conventional disciplinary sciences alone have struggled to tackle these complex issues. In response, various integrative research approaches have been proposed as new ways of investigating these complex problems, such as Integrated Research [4,5], Transdisciplinary Research [1,6,7], Sustainability Science [8], Action Research [9], Undisciplinary Research [10], and Integrative Applied Research [11].
Schon [12] describes these great challenges as ‘swamps’. In Schon’s metaphor, these swamps are overlooked by disciplinary science, which exists on ‘high ground’ where “manageable problems lend themselves to solutions through research-based theory and technique” (p. 28).
The potential benefits of integrative research approaches have been canvassed by many: they achieve a better understanding and framing of the problem [13,14,15]; they provide new knowledge, insight, and innovations as a result of multiple disciplines, research approaches, or relevant knowledge sources coming together [4,5,15,16,17]; they are more able to deal with complexity and uncertainty [18,19]; they produce more relevant, practical, legitimate, and socially robust knowledge [6,15,20,21,22,23]; they can underpin informed policy and decision-making [4]; and they can achieve societal change [24]. Moral obligations for undertaking more integrated research have also been proposed: that those implicated in the science have a role in its production [25] and a universal drive towards greater participation [4].
However, there are also a great number of challenges to this type of research: lack of a common framework [1,11], additional resourcing requirements [26], language barriers [6,27], methodological challenges and contrasting views on what is considered relevant or legitimate data [5,16,19,28,29], challenges of integration [2,11,30], plurality of values and value conflicts [3,31,32], challenges of knowledge transfer [21,30], challenges of evaluation and reproducibility [1,2,33,34], difficulty in finding individuals with disciplinary skills and ability to work in a collaborative team [32], a disconnect between theoreticians and practitioners, tensions with traditional scientific career progression [10,28], and dissonant individual or institutional priorities and cultures [5,7,30,31].
In addition to these challenges, an overriding concern for disciplinary researchers when moving into the swamp is the discrepancy between the technical rigour of their high-ground disciplinary practice and the apparent sloppiness of the swamp [12], where research is structured not by an inquiry or a knowledge gap to be filled but by the task of solving real-world problems [22]. Research that acknowledges an intent to create change [7] requires researchers to shift from being producers of knowledge to being active contributors to a social process of tackling real-world problems [6]. Furthermore, complex systemic problems, such as climate change, are not neutral objects of inquiry [31] and the values of the individuals and institutions involved influence how the topic is researched. Working with complex, systemic problems requires different ways of researching [35]; the factors contributing to success in integrative research are unclear [13,36,37] and scientific practice trails aspirational research design for tackling complex problems [31]. This lag is compounded by the absence of comprehensive guidance on how to research complex real-world problems, which means newcomers often rely on intuition to invent ways of dealing with these challenges [6,11,12,38]. Even researchers with experience in integrative research often fail to systematically reflect on what worked well and what did not, resulting in unnecessary duplication with each new integrative research project [26].
In this paper, as a pilot, we test the Integration and Implementation Sciences (i2S) framework [11], an integrative research framework targeted at complex real-world problems, using seven agri-environmental case studies. We hypothesise that agri-environmental case studies in which concepts from the i2S framework are considered will be perceived as more useful by next users of the research. We assess the perceived usefulness by interviewing and surveying next users from each case study.
By evaluating a published framework, we aim to support those disciplinary scientists who have just arrived ‘in the swamp’ and who have limited capacity and time to seek and critique guidance on integrative research from a fragmented and potentially inaccessible body of literature [11,22]. We also seek to augment the i2S framework, as a potential set of handrails for researchers in the swamp, with practical examples of successful integrated research activities in agri-environmental research projects.
There are multiple approaches for tackling complex real-world problems, such as systems engineering [39], transdisciplinarity [40,41,42], systemic intervention [43,44], and co-innovation [45]. All of these approaches take a problem orientation, coming from the perspective of tackling the problem and not from a disciplinary approach, and they inherently recognise that there will be differing views on what the problem is. Similarly, all the approaches recognise that not only multiple disciplines but multiple sources of knowledge are important, that achieving change involves more than supplying a simple technical solution, and that working with stakeholders is key to ultimately embedding the desired impact. In addition, some approaches also specifically consider the behaviours of the team and how the team works together. However, a key reason for choosing the i2S framework for this pilot is the explicit way it deals with unknowns. In complex socio-environmental problems, all decisions have to deal with uncertainty, and as the issues become more complex, the different dimensions of uncertainty become more apparent [46]. As decisions still need to be made, an understanding of the inevitable uncertainty of the science is integral to the decisions being made [47]. In contrast to many research traditions, in which the unknown is seen only as the substrate to be converted into knowledge, the i2S framework takes a different stance: it brings together diverse approaches to provide a rich understanding of unknowns and ways of dealing with them [11].

The Integration and Implementation Science Framework

The i2S framework is structured around three domains: synthesising disciplinary and stakeholder knowledge (bringing together what is known and researching a problem using both research and practical experience); understanding and managing unknowns (appreciating that everything about a complex problem cannot be known and unknowns need to be managed and accounted for); and providing integrated research support for policy and practice change (actively supporting the implementation of the research findings).
Five questions are applicable across each domain (Figure 1). These questions cover (1) what the research is aiming to achieve and who will benefit; (2) what knowledge is synthesised, which unknowns are considered, and which aspects of policy or practice change are targeted; (3) how the knowledge is synthesised, how diverse unknowns are understood and managed, and how integrated research support is provided; (4) what circumstances influenced the research; and (5) an evaluation of how well questions 1–4 were addressed [11].

2. Methods

To test the hypothesis that using the concepts in the i2S framework renders research more useful to next users, we test the i2S framework using seven case studies. Figure 2 shows how the case studies were used to test the hypothesis.

2.1. Choosing Case Studies

A case study investigates real-life phenomena through an analysis of events, conditions, and the relationships between these [48]. In social science, case studies are used when trying to answer how and why questions [49]. In our research, multiple case studies were selected to understand how next users perceive research that has considered the i2S framework.
Seven cases (see Table 1) were chosen through purposive theoretical sampling, meaning the cases were chosen for theoretical and not statistical reasons [50,51]. Our cases cover a range of agri-environmental problems: all were considered complex; all had multiple stakeholders and multiple relevant scientific disciplines; all recognised that sources of knowledge beyond scientific knowledge needed to be included; and all were characterised by a real-world problem; in other words, in each case there were people trying to tackle the problem through policy or practice change. There were also practical reasons for our case study selections: for example, we chose projects with which at least one of the authors was familiar or involved, thereby facilitating access to participants or information. Following the guidance of Yin [52], case study data collection used multiple rather than single sources of evidence to provide triangulation, a case study database was created to store information, and a clear chain of evidence was maintained.

2.2. Complexity Assessment

The case studies were all assessed for their degree of complexity [53]. A three-item measure of project complexity was used, developed from Arkesteijn et al.’s [54] model of complexity [53], comprising three dimensions: scientific disagreement, stakeholder disagreement, and systemic stability (how resistant the system is to change). Each dimension was rated on a 12-point scale: the higher the score, the greater the contribution of that dimension to the complexity of the project’s problem. Following Patton [55], scores were categorised as 1–3 = a simple problem, 4–6 = a complicated problem, 7–9 = a complex problem, and 10–12 = a seriously complex problem [53]. Only case studies that scored 7 or over in at least two of the three dimensions were included in the research.
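To make the screening rule concrete, the following minimal sketch (with invented scores; the function and dimension names are ours for illustration, not part of the study) applies the complexity bands and the two-of-three inclusion threshold described above:

```python
# Illustrative sketch only: scores and names are invented, not study data.

def complexity_band(score: int) -> str:
    """Map a dimension score on the 12-point scale to Patton's bands."""
    if score <= 3:
        return "simple"
    if score <= 6:
        return "complicated"
    if score <= 9:
        return "complex"
    return "seriously complex"  # 10-12

def include_case(scores: dict[str, int]) -> bool:
    """Include a case only if at least two of the three dimensions score 7 or over."""
    return sum(s >= 7 for s in scores.values()) >= 2

# Invented scores for one hypothetical candidate case study:
candidate = {"scientific_disagreement": 8,
             "stakeholder_disagreement": 9,
             "systemic_stability": 5}
print({dim: complexity_band(s) for dim, s in candidate.items()})
print("Included:", include_case(candidate))  # True: two dimensions score >= 7
```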

2.3. Deconstructing the i2S Framework

To systematically collect data on the case studies and permit us to score them, we first deconstructed the domains and questions of the i2S framework into discrete ‘elements’. We did this for the first four questions in each domain. Initially, we identified 59 elements across the three domains. These elements became probe questions designed to elicit narrative conversation regarding the implementation of the identified i2S elements in a project. However, after piloting the probe questions on fellow researchers familiar with the project, we discarded several of the elements/probes that did not elicit a clear response or that were repetitive of other probes/elements. Thus, 43 elements were identified across the three domains, with domain 1 containing 19, domain 2 containing 9, and domain 3 containing 15 (for more information on the operationalisation of the framework into elements, see [53,63]). Appendix A presents a table in which the three domains, the four questions in each domain, and the elements associated with each of the questions are listed. The fifth question in each domain is an evaluation question and does not introduce new elements. To collect data on the fifth question, the project teams associated with each case study were asked to evaluate their project in each domain, reflecting on the decisions they made, the methods they used, and the outcomes they achieved. The questions used were drawn from Bammer [11] (pp. 55, 99, 141) and further refined in discussion with Gabriele Bammer; they are included in Appendix A.

2.4. Case Study Data Collection

Researchers in each of the seven case study project teams were invited to a day-long workshop (three workshops in total) where they worked with a facilitator who used the domains, questions, and element probes to elicit a description of each case study through the lens of the i2S framework. For case studies 3, 4, 5, and 6, project team members recorded the project descriptions and evaluations themselves. For case studies 1, 2, and 7, either one of the authors recorded the information (case studies 1 and 7) or the workshop was recorded and transcribed (case study 2). The workshops, and the subsequent process for deriving interview and survey questions, were approved by the Manaaki Whenua–Landcare Research social ethics process (1617/06).

2.5. Scoring the Case Studies

Translating the narrative project descriptions into quantitative scores, and assessing the degree to which each element of the i2S framework was implemented in the projects, was a subjective rating process and therefore potentially subject to individual bias. We took the approach described below to reduce individual bias and to ensure that analysts were providing comparable evaluations (for further information about this process, see [53]). Three analysts independently assessed each case study description and drew out pertinent information from the qualitative data on how the case study addressed each of the 43 i2S elements and on the team self-evaluation. For each assessment, one of the analysts had some knowledge of the case study; the other analysts had experience of working with qualitative data. The analysts then rated the degree to which each element was addressed, based on the evidence in the case study descriptions. They also rated the team self-evaluation of project performance to generate a score for the team’s self-assessment. The analysts used a five-point scale with descriptive anchors at the scale mid-point and the two endpoints. For the assessment of the consideration of i2S elements,
0 = the element was absent or not addressed in the data describing the project;
2 = the element was present or addressed to a moderate degree; and
4 = the element was strongly present or addressed to a high degree.
For the assessment of the team self-evaluation,
0 = very poor;
2 = adequate; and
4 = excellent.
The three analysts then compared their ratings for each of the elements of each project and the team self-evaluation. If their independent ratings for individual elements differed by more than one scale point, analysts discussed and reconsidered the original qualitative data and reached agreement (within one scale point) for that element. The mean of the three analysts’ ratings was taken to provide the final rating for each element. In total, seven analysts were used across the seven case study projects, including three of the authors.
For each case study, an average score was calculated for each of the three i2S domains, and an overall project average score was calculated from the average domain scores. In this way, each of the i2S domains was weighted equally.
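This arithmetic can be sketched as follows (a minimal, hypothetical illustration: the analyst ratings and element groupings below are invented placeholders, not the real 43-element data):

```python
from statistics import mean

def reconcile(ratings: list[float]) -> float:
    """Final element rating: the mean of the three analysts' ratings,
    assumed to already agree within one scale point (larger discrepancies
    were resolved by discussion beforehand)."""
    assert max(ratings) - min(ratings) <= 1, "analysts must first reach agreement"
    return mean(ratings)

# Invented ratings (0-4 scale), three analysts per element, grouped by domain:
element_ratings = {
    1: [[3, 3, 4], [2, 2, 3]],  # domain 1 elements (really 19 of them)
    2: [[1, 2, 2]],             # domain 2 elements (really 9)
    3: [[4, 3, 4], [3, 3, 3]],  # domain 3 elements (really 15)
}

domain_scores = {d: mean(reconcile(r) for r in elems)
                 for d, elems in element_ratings.items()}
# Averaging the domain means (rather than all elements together) weights the
# three domains equally despite their different element counts.
project_score = mean(domain_scores.values())
print(domain_scores, round(project_score, 2))
```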

2.6. Next-User Assessments

Case study project team members identified all of the immediate ‘next users’ of their research and their roles (e.g., decision-maker, planner, and industry representative). These next users were invited to participate in an evaluation of the case studies. Participants were asked about their experiences of the case study research projects and the usefulness of the research process and outputs for their (next-user) purpose, through interviews or an online survey (Table 2). The interview (Appendix B) and survey questions (Appendix C) did not ask about each of the 43 i2S elements but related more broadly to one of the three i2S domains. A final section of the survey and interview asked for an overall project assessment. Interviews were conducted by the second and third authors. All interviews were recorded and transcribed, and the transcriptions were analysed as described below. The survey was administered by the second author.
For the interview data, three analysts independently assessed each interview transcription and drew out pertinent information from each interview question response. From this qualitative data, they rated the perceived usefulness of the research for each next user using a five-point scale with descriptive anchors at the scale mid-point and end-points where:
0 = the research was not at all useful;
2 = the research was moderately useful; and
4 = the research was very useful.
The analysts compared their ratings for each interview question as before, reaching agreement (within one scale point) for each rating. The final rating for each interview question was the mean of the three analysts’ ratings.
For the survey data, the next users rated the research processes and outcomes on a five-point scale, and an average of the next user scores was taken.
From both the interview and survey data, an average score of the usefulness of the research was calculated from the questions in each domain for each next user. All the next-user domain scores for each domain were pooled to calculate an average domain score, and an overall project score was calculated from the average domain scores.

2.7. Study Limitations

The limitations of this pilot study include the limited number of available next users for each of the case study projects. Due to the nature of the assessments, the next users needed to be close enough to the project to judge the usefulness of its different dimensions; this limited the pool of next users available to participate. For example, in one case there were only two next users. A further limitation of this pilot is the mixed approach to next-user assessments, with two cases using interviews and five cases using surveys; this was the result of time and resource constraints. The impacts were minimised as much as possible by designing the survey based on the interview questions.
These limitations would be addressed by expanding this pilot to more case studies.

3. Results Part 1: The Relationship between Consideration of the i2S Framework and Usefulness

Table 3 and Figure 3 show the degree to which each case study considered i2S framework elements and how useful the research was perceived to be by next users. At the framework level (averaged across domains), all case studies gave moderate to strong consideration to i2S elements and were considered to have been at least moderately useful by next users. At a domain level, all the case studies gave at least moderate consideration to the elements of the i2S framework and were considered by next users to be at least moderately useful in all domains, with the exception of domain 2 in case study 5.
For all the case studies, there was greater consideration of i2S elements from domains 1 (synthesising science and stakeholder knowledge) and 3 (providing integrated research support) than from domain 2 (understanding and managing diverse unknowns). This was somewhat reflected in the usefulness assessments, as in five of the seven case studies next users found the research projects less useful for understanding and managing diverse unknowns than for synthesising knowledge or providing integrated research support.
Usefulness assessed by next users grew with increased consideration of i2S elements by the project teams. This relationship is seen both at the domain and whole-of-framework level (Table 4). All the correlations are positive and significant (p < 0.05). The next-user interviews and surveys concluded by asking about the overall usefulness of the case study research processes and outputs. The responses from the overall usefulness section were analysed separately to cross-check against the results from the domain and framework evaluation (Table 5). The correlation between the degree of consideration of the i2S elements and the overall assessment by next users was positive and significant (p < 0.05), supporting the domain and framework results.
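As an illustration of this analysis, the sketch below pools invented next-user usefulness scores per case (as in Section 2.6) and correlates them with invented consideration scores; the real data are not publicly available, and the paper does not name the correlation test used, so Pearson’s r is shown as one standard choice:

```python
from statistics import mean
from scipy.stats import pearsonr

# All numbers below are invented for illustration; they are not study data.
consideration = [3.1, 2.4, 3.4, 2.0, 2.7, 3.0, 3.3]  # one mean i2S score per case
usefulness_by_next_user = [                           # next-user scores per case
    [3.5, 3.3], [2.4, 2.8, 2.6], [3.7, 3.5], [2.2, 2.1],
    [2.6, 2.4, 2.6], [3.0, 3.2], [3.6, 3.4],
]
usefulness = [mean(u) for u in usefulness_by_next_user]  # pooled per case

r, p = pearsonr(consideration, usefulness)
print(f"r = {r:.2f}, p = {p:.4f}")  # a positive r with p < 0.05 supports the hypothesis
```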
There are two limitations in this analysis. First, assessing the extent to which a case study considered the elements of the i2S framework does not explicitly capture the quality of the methods used; although there is a significant correlation (0.74, data not shown here) between the extent of consideration of i2S elements and the project team’s self-evaluation of how well they performed, the extent of consideration only partially captures methodological quality. Second, case study 4 obtained next-user information from only two next users, although the overall statistical findings were similar when case study 4 results were removed from the analysis.
These results suggest that increasing consideration of the elements of the i2S framework produces more useful results for a range of next users, and they also suggest what to pay attention to when undertaking research on complex problems: the ‘handrails’. The i2S framework is not, however, prescriptive about the ‘how’. The qualitative data, to which we now turn, provide some examples of what doing this type of research well looks like.

4. Results Part 2: The Aspects of the Case Studies That Next Users Found Most Useful

All of the case study projects did some things well. In this section, we use the next-user scores for the interview and survey questions to identify specific aspects of the case studies where the next users found the research activities or products most useful in each of the three i2S domains (scoring ≥ 3.5). We then describe what the relevant case study projects actually did.

4.1. Domain 1: Synthesising Disciplinary and Stakeholder Knowledge

The premise for the first domain in the i2S framework is the recognition that understanding and researching complex real-world problems does not just involve combining knowledge from multiple disciplines but includes relevant stakeholder knowledge as well. The aspects of the case studies the next users found most useful in domain 1 were how the projects had involved the right people and disciplines, how they had identified and understood the problem, and how they sought to understand multiple perspectives.

4.1.1. Involving the Right Disciplines and the Right People

The next users of case studies 2 and 3 considered that the projects had included all the necessary disciplines and stakeholders.
The projects used several methods to identify what knowledge sources and disciplines ought to be included. Case study 2, which was about on-farm nutrient losses, conducted some preliminary regional scale Geographical Information Systems (GIS) modelling of nutrient losses to empirically identify which industries (and therefore which disciplines) should be involved in the project. They noted that “those industries that represented a significant proportion (>80%) of the nitrogen loss in Canterbury were deemed to be most necessary to have involved”. Case study 3, which was about farm nutrient management, considered the system they were trying to modify to identify the right disciplines to include. Through looking at the farm as a system, and the nitrogen cycle, they identified potential intervention points to target in the research and thereby identified the relevant disciplines.
To determine the right stakeholders to involve, case study 2 drew representations of stakeholder interest in and influence over the project to identify those individuals and organisations with the greatest influence and interest in achieving the project goal. Case study 3 undertook a similar stakeholder analysis, but in addition identified where and from whom farmers got their information, in order to include these key information sources. Both projects considered how change would occur as a result of the project and used this to identify others that needed to be engaged in the project.
In both cases, the project teams had a very clear purpose for bringing together different sources of knowledge and people:
“we were trying to do science for policy and … the implementation was really key, so we wanted to have the people … most closely involved in implementation to be right in the heart of the research project making sure that it was credible and relevant and legitimate.”(CS2)
“These guys brought practical knowledge, combined with policy and research knowledge from the rest of the team.”(CS2)
“Needed to include various disciplines and end-users (farmers) to maintain integrity … and accelerate adoption.”(CS3)

4.1.2. Identifying the Problem and Synthesising Knowledge

The next users of case study 7 found the research processes had been particularly effective for identifying the problem. This case study, which was about agricultural innovation, included researchers and agricultural industry partners and used an extensive range of systems tools to define the problem from multiple perspectives and investigate links between problem elements. As one project member recounted,
“We used causal loop diagrams, CATWOE, innovation journey analysis, structures and functions framework, and multi-level perspective.” (Causal loop diagrams are used in System Dynamics to understand the behaviour of a system over time. CATWOE is a method to prompt thinking about a change in the system. Innovation journey analysis is a method for analysing the history of an innovation from inception to implementation. Structures and functions framework is a design framework for designing a change in a system, comprising three classes of variables: function, behaviour, and structure. Multi-level perspective is a framework describing the scaling up and out of innovations from niche, to regime, to landscape scale.)
For understanding the problem, the next users found the ways that case studies 2, 3, and 7 brought together and synthesised information from different knowledge sources particularly useful. All three projects used informal dialogue methods extensively. Through multiple workshops and discussions between researchers and stakeholders, participants gained a joint understanding of the system, of the context, and of the problem, and together interpreted the results.
In addition to the extensive use of dialogue, the three case studies used quite different approaches to bring knowledge together. Case study 2 co-developed a conceptual framework and then used a suite of numerical models to synthesise the different knowledge sources and populate the framework. Case study 3 employed place-based synthesis by using monitor farms to bring together the real-world experience of the farm and research experience and combined this with story-telling by different members of the project. Case study 7 synthesised the project findings for specific stakeholder needs, creating ‘value-add’ documents, where the research being done was brought together and interpreted with particular reference to each stakeholder perspective.

4.1.3. Understanding Different Perspectives

The next users of case studies 1, 2, and 6 reported that they had a good understanding of the diversity of perspectives and that the projects had managed those diverse perspectives well.
In case study 1, which supported a collaborative water policymaking process, the next users reflected that there had been a long and sometimes acrimonious history surrounding management of water, and it seemed as if each group had their own “facts, uncertainties, solutions and aspirations that never seemed to have been all tested together.” To recognise the different perspectives but not privilege any one, the case study project team took a set of community values that had been developed and agreed by the community and used these values to guide technical assessment. They built an assessment framework that predicted the likelihood of a future scenario delivering a range of community values. Through the framework they could be “clear about reflecting the range of aspirations and comparing them completely uniformly”.
For case study 2, there were six agricultural industries in the project who, together, were tasked with defining agricultural Good Management Practice (GMP). In this way, the project had already included a diversity of perspectives, across these different industries, when choosing whom to include. However, the project also recognised that there was important diversity within each of the industries. To capture this diversity in the development of GMP:
“Each of the industries went away and did quite a significant canvassing and consultation and creation exercise with groups of farmers … to canvass perspectives within each of those groups.”
The project also included a pan-sector reference group who were asked to look at things “from a broad industry focus”.
Case study 6, which was about improved irrigation management, used a visual aid to elicit and discuss a range of perspectives. They created a schematic diagram describing the irrigation management landscape. It conceptualised how people viewed irrigation in a wider context, explored who the different key players were, and what the barriers were at different scales. The project team “showed [the] map many times to get other perspectives: ‘this is how I see it’ and ‘what’s your view?’…when meeting with stakeholders, face-to-face meetings, workshops at end of irrigation season”. The schematic was updated as new perspectives were included.

4.2. Domain 2: Understanding and Managing Diverse Unknowns

The premise for the second i2S domain is that unknowns and uncertainties cannot be avoided. Complex real-world problems can be characterised as situations where “…facts are uncertain, values are in dispute, stakes are high and decisions are urgent” [64], and as the issues become more complex, the different dimensions of uncertainty become more apparent [46]. The next users of case studies 3, 4, and 7 found aspects of the way the projects tried to manage unknowns particularly useful. However, as there was less overall consideration of unknowns than of the other domains, the research activities pertaining to managing unknowns are described together here.
To be able to respond to unknowns as they arose, case study 3 built flexibility into the project. Instead of trying to decide what all of their research steps would be up front, they put ‘go/no-go’ decision points into the contract to allow changes in direction throughout the project. They also designed in a mid-term stocktake of all their research to date, in which they identified some promising options, leading to a refocussing of the research for the remainder of the project. They used this reflection process throughout the project by scheduling regular “research aim or critical step team meetings to reflect, analyse and (re)frame new research.”
Case study 4, which was about managing timber, used a similar adaptive project management approach to case study 3: “do, reflect, adapt”. As a way of getting a better understanding of the relevant unknowns and “to help mitigate for unknowns and uncertainty”, the project used workshops to bring together multiple perspectives. In a series of industry engagement workshops, they used “debate and robust discussion” to reflect on the industry issues and the research being done. During one of these workshops, they made an important serendipitous finding, “only discovered by having an appropriate mix of research, development, grower and company in same room working together”. The adaptive management approach they had adopted enabled them to adapt the programme in response to this discovery and go on to deliver benefits for the industry.
Case study 7 began by completing a programme logic that “required considerations of assumptions and foreseeable unknowns that might arise”. They kept track of emerging issues and unknowns from the beginning of the project using three approaches: (1) regular reflection sessions with the project team to explore why progress towards outcomes was working (or not), and they had dedicated reflexive monitors to help the teams through these reflection processes; (2) adaptive management to respond to the emergence of unknowns and maintain relevance; and (3) learning from successes and failures in other innovation projects.

4.3. Domain 3: Providing Integrated Research Support for Policy and Practice Change

The premise for the third domain is the increasing interest in how research influences policy and practice change. The aim is to do research that better supports decision-making, not only by dealing with what is known about a problem but also by helping better manage and account for unknowns and uncertainty. This domain also focuses the attention of researchers on how they can support the change they want to see happen as a result of the research. The aspects of the case studies the next users found most useful in domain 3 were relevant and timely outputs; well communicated outputs; and the way support had been given to the next users.

4.3.1. Producing Relevant and Timely Outputs

The next users of case study 6 thought the project team produced particularly timely and relevant results for them. In the project, farmers were clearly identified as the target audience and were critical to the project outcomes of improved irrigation management. The project established what the farmers needed to help them manage their irrigation better. As a result, the project posted daily website updates and provided a portal for farmers to view their own farm data as well as data from other contributing farms. The project team also helped farmers make sense of their data through phone calls, training sessions, and emails.

4.3.2. Results Communicated Helpfully

The next users of case studies 2 and 3 found the research outputs were usefully communicated.
Case study 2 interacted with the target audiences from day one of the project and took an approach of extensive communication, using workshops, presentations, models, reports, briefs, one-on-one and group discussions, and attending policy development meetings to introduce the science where necessary:
“Compared to a group of people going off doing a piece of research and delivering a report or a product [at the end] and walking away, it couldn’t really have been further from that … having the next users on the team helps”.
They communicated the science at different levels of complexity. They made good use of different organisations, interests, and expertise in the team. They were mindful that team members “were credible or legitimate in different roles” and therefore involved “almost all of the members of the research team … in the delivery of the research findings in one way or the other, whether they were presenting to their peers, whether they were presenting to industry, or whether they were involved in the policy development.”
Case study 3 developed a clear communications plan for their target audiences and identified and planned communication events such as “workshops, field days and project team meetings”. They also developed capacity within the host organisation to be able to integrate and communicate findings more widely.

4.3.3. Supporting Policy and Practice Change

The next users of case studies 1 and 3 found that the research processes and outputs were particularly useful for supporting them in policy or practice changes. In both case studies, the target audiences and the desired outcomes were clearly identified by the project teams, and right from the start both teams considered how they would support the implementation of the project results.
At the start of their project, the case study 1 team had “various discussions with [next users] about how we might best work together”, including devices like field trips and “re-structuring the way we undertook the science in explicit recognition of the need to recognise end-users’ needs and interests and values.” The team were also very deliberate in how they framed their role:
“We tried to frame the research findings as a ‘service’ trying to make transparent the consequences and uncertainties to allow [next users] to make an informed decision. We also resisted the invitations to ‘tell us what the answer is’. The way we described [the results] was in the form of a coloured matrix with the likelihood of meeting the outcomes [desired] by the community. This meant that different scenarios could easily be compared.”
The team paid particular attention to who was providing support to the next users. Described as brokers, these individuals
“were specifically identified as having skills, attributes and interest in translation and communication, [were] non-defensive in their approach and had empathy with target audience and were also comfortable with uncertainty”.
In case study 3, before the project was funded, the team considered the changes they wanted to achieve and “workshopped the necessary steps to support farmers in transition” and used these insights to inform the project proposal. They used a programme logic approach to make transparent how they think change happens and tested it with the monitor farmers early on. To support these changes, the project endeavoured to have all “knowledge in the public domain [and] data available to all parties”, which required them to resolve intellectual property issues between organisations in the early stages of the project. They also formed a relationship with the owners of an independent model that is used widely in the regulation of farm nutrient losses, to expedite the inclusion of the research in the model “in order to increase uptake and reward farmers for doing the right thing.”

5. Discussion

The findings from this pilot suggest that the extent to which the concepts in the i2S framework are considered can influence the usefulness of research outputs to next users and, as such, provisionally suggest that the i2S concepts can be considered effective ‘handrails through the swamp’ in agri-environmental research (in particular, research projects that have multiple stakeholders, relevant scientific disciplines, and knowledge sources, and in cases where there is a real-world problem that is being tackled through policy or practice change, making management of unknowns a critical consideration). By examining the case studies through the lens of the i2S framework with insights from the next users, some useful examples of how these handrails are used to good effect surfaced.

5.1. Domain 1: Synthesising Disciplinary and Stakeholder Knowledge

The importance of acknowledging different sources of knowledge beyond traditional science disciplines is well recognised [5,30,34,65,66]. In domain 1, synthesising disciplinary and stakeholder knowledge, we see the strongest correlation between the case study research teams’ use of concepts in the i2S framework and the perceived usefulness by next users. It is likely that the researchers involved already had experience in this area.
The case study projects considered most useful were very clear about why they needed different people, perspectives, expertise, and knowledge sources. The importance of this clarity is well recognised [67,68], and the i2S framework brings the question to the fore by asking, ‘What is the research for and who is it intended to benefit?’ What the framework is silent on, and what would be useful to include, is consideration at this early stage of who might lose out from the research [69].
The benefits of including other perspectives and knowledges envisaged by these case studies relate not only to gaining a better understanding of the system [34] but also to practical legitimacy, increased rigour, and greater implementation [14,23,70]. Considering a breadth of perspectives contributes to the quality and impact of research according to Hansson and Polk [67], who also caution researchers to clearly identify which perspectives are needed and why, and not just go for the wholesale involvement of every player. Including both scoping and boundary setting in the i2S framework guides researchers to scope the disciplines, types of knowledge, and stakeholders that could be involved and then make transparent decisions on what and whom to include. Those case studies considered to have chosen stakeholders and disciplines well all used some type of independent analysis to help determine the disciplines and stakeholders needed.
How different types of knowledge are brought together is a principal challenge in integrated research [2,30,32]. The i2S framework acknowledges broad classes of synthesis such as dialogue and modelling. However, the case study projects deemed to have synthesised knowledge best used multiple methods of synthesis simultaneously. The case studies built on the benefits of individual synthesis methods such as modelling with stakeholder participation [17]; co-development [71]; place-based synthesis to anchor the project in the context of the user, e.g., [67]; and narratives to synthesise multiple perspectives [72] and communicate complex information [73], and they combined the results in various ways to reflect that different people understand information in different ways, e.g., [74]. No single method of synthesis is complete; each focuses on different aspects, and using multiple methods has the advantage that they complement each other [75,76].

5.2. Domain 2: Understanding and Managing Diverse Unknowns

In domain 2, understanding and managing diverse unknowns, we see a weaker correlation between the case study research teams’ use of concepts in the i2S framework and the perceived usefulness by next users (cf. domains 1 and 3). Bammer [11] (p. 16) notes that this area of work generally receives much less attention in integrative research projects, and our case studies support this assertion: all of them spent considerably less time and effort on the activities of domain 2 than on those of domains 1 and 3.
Projects that were considered most successful in managing unknowns deliberately built adaptation into their project design. They recognised that impactful research must embrace uncertainty [8,47] and emergence [2]. They used reflection [4,77] and adaptive management [78] to respond to emergent situations or circumstances and serendipitous discoveries. Some projects used minimal specifications [8] in their contracting to support this flexibility and adaptiveness. An additional benefit of regular reflection was that it supported learning in the projects [7]. The i2S framework does not explicitly mention the importance of a learning approach, e.g., [8,23,38], but many of the projects deliberately created space for learning to occur between the research team, extended network, and next users, through devices such as field trips, gathering of a community of practice, or even something as simple as regular in-person gatherings of researchers, partners, and end users to jointly make sense of the research results.

5.3. Domain 3: Providing Integrated Research Support for Policy and Practice Change

In domain 3, integrated research support for policy and practice change, we see a strong correlation between the case study research teams’ use of concepts in the i2S framework and the perceived usefulness by next users. Along with domain 1, this was a clear focus for the case study projects.
The teams of the case studies that were considered most useful for supporting policy and practice change thought about the impact they wanted to have, the influence they wanted to exert, and how their project would support change, right from the start. They had a clear idea of the reasons why they needed to include a range of people, perspectives, and knowledge sources, and a clear idea of whom and what they were targeting and how those in that system made change. Instead of impact being considered at the end of a project and viewed largely as a communication exercise, these case studies designed the research around it, influencing what they did and whom they did it with. This designing for impact [7,30,79] has profound implications for the conception, design, implementation, and evaluation of individual research projects. Although the numbering of the domains might suggest that consideration of implementation and change happens towards the end of the project, Bammer [11] notes that the framework is flexible in terms of the order in which it is used and that it works best when implementation starts to be considered during project design.
The same case studies endeavoured to understand how their target audiences made decisions, including how they and other stakeholders considered unknowns. Lemos and Morehouse [71] note that useable science, as opposed to just useful science, directly reflects constituent needs, is understandable to users, is available at the time and place needed, and is expressed in a medium that is useful for next users. These case studies were attentive to the timing and frequency of decision-making, whether for policy or on-the-ground practice change. They explicitly sought to understand how different target audiences wanted to engage and what type of output would be meaningful for them. As Polk [21] found, the projects geared research synthesis and outputs to target audience needs to increase compatibility with the target audience’s decision-making processes. The third domain in the i2S framework guides researchers to develop this greater understanding of the target audiences and their world.
In domain 3, researchers are prompted to consider not just the outputs of the research but also how that support is given. When choosing their teams, the case studies considered most useful regarded a range of skills and attributes as important, such as brokering, integration, translation, reflection, communication skills, and the ability to work with difference [13,32,80], as well as credibility, empathy, humility, and an understanding of the decision-making and policy context [19].

6. Conclusions

For disciplinary scientists, the literature on integrative research remains fragmented and difficult to navigate. For this reason, we chose to evaluate an existing integrative research framework, the i2S framework, to provide disciplinary scientists with evidence of the framework’s efficacy in tackling complex socio-environmental problems. We found that research teams who considered the concepts of the i2S framework produced research that was considered useful by their next users. These findings suggest that disciplinary scientists can use the i2S framework to generate useful agri-environmental research, but further evaluation of the i2S framework in applications beyond agri-environmental research is necessary to understand its broader efficacy for research on complex problems. Further research is also needed on when in a project’s life cycle the framework is best deployed.

Author Contributions

Conceptualization, M.R.-W., B.S. and R.R.-W.; Methodology, M.R.-W., B.S. and R.R.-W.; Formal Analysis, M.R.-W., B.S. and R.R.-W.; Investigation, M.R.-W., B.S. and R.R.-W.; Writing—Original Draft, M.R.-W., N.K. and R.R.-W.; Writing—Review and Editing, M.R.-W., B.S., R.R.-W. and N.K.; Funding Acquisition, M.R.-W. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Ministry of Business, Innovation and Employment under the Strategic Science Investment Fund, and by the Ministry of Business, Innovation and Employment Our Land and Water National Science Challenge under grant C10X1507.

Institutional Review Board Statement

The study was conducted according to the guidelines of the Declaration of Helsinki and approved by the Social Ethics Committee of Manaaki Whenua (application number 1617/06) on 1 July 2016.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

In accordance with the social ethics approval, the underlying data is not publicly available.

Conflicts of Interest

The authors declare no competing interest.

Appendix A. i2S Framework Elements

Domain 1: Synthesising disciplinary and stakeholder knowledge

Question 1: What was the synthesis of disciplinary and stakeholder knowledge aiming to achieve and who is intended to benefit?
  • Purpose of combining disciplinary and stakeholder (SH) knowledge documented.
  • Disciplinary and SH beneficiaries documented.
Question 2: Which disciplinary and stakeholder knowledge was considered?
  • Systems tools used to help define the ‘problem’ and investigate links between problem elements.
  • A recognised method used to develop a system view.
  • Identification of different views on defining the problem.
  • Identification of all the disciplines that could be relevant.
  • Identification of all the stakeholders that could be relevant.
  • A recognised methodology used for identification of relevant stakeholders and disciplines.
  • Relevant science disciplines included and clarity on how decisions were made on what research was undertaken.
  • Relevant stakeholders included and clarity on how decisions were made on which SH were considered relevant to include.
  • All the relevant perspectives within each stakeholder group identified.
  • Consideration given to how the problem was described or framed.
  • Differences or conflicts managed.
Question 3: How was the disciplinary and stakeholder knowledge synthesised, by whom, and when?
  • Knowledge from the stakeholder and science participants synthesised using either formal or informal methods.
  • Consideration given to who was the most appropriate person(s) to lead the integration and synthesis.
  • Synthesising of knowledge (stakeholder with stakeholder, discipline with discipline, and stakeholder with discipline) occurs at multiple stages in the project.
Question 4: What circumstances influenced the synthesis of disciplinary and stakeholder knowledge?
  • Overall context (and changes in context) influencing the research project taken into account.
  • Consideration given to the authorisation and legitimacy of the project.
  • Project barriers and facilitators managed.

Domain 2: Understanding and managing diverse unknowns

Question 1: What was the understanding and management of diverse unknowns trying to achieve?
  • Clarity within the research team, research stakeholders, and target audience on why managing for unknowns is important.
Question 2: Which unknowns/uncertainties/risks were considered?
  • Systematic consideration of unknowns (at the outset and ongoing through the project life).
  • Opportunities and risks associated with undertaking the project managed.
Question 3: How were the recognised unknowns and uncertainties managed or responded to?
  • Tools or processes used to help cope with uncertainty in the research or implementation process, e.g., the precautionary principle, scenarios, sensitivity analysis, hedging, and adaptive management (acceptance approach).
  • Identification and management of unknowns that were irreducible within the scope of the project.
  • Surprises or the unexpected managed.
  • Consideration of participants’ tacit knowledge (researchers, research stakeholders, and target audience) and management of any conflicts.
  • Communication of uncertainty in research outputs and implications for policy or practice change.
Question 4: What circumstances influenced the management of unknowns?
  • Management of project barriers to, and facilitators of, understanding and managing unknowns within research and target audience organisations.

Domain 3: Providing integrated research support for policy and practice change

Question 1: What is the integrated research support aiming to achieve, and who is intended to benefit?
  • Practice or policy change intent of the project clearly stated.
  • Target audience (e.g., next user, end user; government, business, or civil society) clearly stated.
Question 2: Which aspects of policy and practice are targeted by the provision of integrated research support?
  • Consideration given to the target change as part of a system (e.g., if it was a change of regional council policy, was the way the regional council makes policy decisions considered, the position in the policy cycle, the reaction of lobby groups and community groups?).
  • Identification of different systems views.
  • Consideration given to how change happens, explicit or implicit (e.g., programme logic, causal chain modelling, and theory of change).
  • Identification of all the opportunities and target audiences relevant to the targeted change.
  • Consideration of how research findings were framed or described for the target audience.
  • Identified differences or conflicts in values between the research team and target audience managed.
Question 3: How was integrated research support provided, by whom, and when?
  • Consideration of what methods were used to deliver research findings to the target audience.
  • Consideration given to who was involved in the delivery of the research findings.
  • Interaction with the target audience occurs at multiple stages in the project.
Question 4: What circumstances might/did influence the provision of integrated research support for policy and practice change?
  • Overall context (and changes in context) influencing the target change and target audience managed.
  • Consideration given to the authorisation and legitimacy of the project to interact with the target audience.
  • Identification and management of difficulties in moving from doing the research (domain 1) to trying to implement (domain 3).
  • Project barriers and facilitators with the target audience managed.

Appendix B. Interview Questions

Domain 1–Synthesising science and stakeholder knowledge
  • Did the research include all the stakeholders and knowledge sources you thought necessary (in order for the results to be useful to you)? If not, who was excluded? And why is this important?
  • What impact do you think that disciplinary and stakeholder selection had on research results?
  • Do you think that mātauranga Māori was well enough integrated in the research outputs (in order for the results to be useful to you)? Why or why not? (Only for case study 1)
  • Did the research provide a useful synthesis of all the relevant knowledge? If not, why not? Was something important excluded? If so, why?
  • How did your involvement with the research improve your understanding of social, environmental, cultural, and economic interactions? (case study 1)/How did your involvement with the research improve your understanding of good management practice, and of land use, soil, and climate interactions? (case study 2)
  • How did your involvement with the research improve your understanding of the diversity of stakeholder perspectives and values?
  • Were your contributions well enough reflected in the research outputs?
Domain 2–Understanding and managing diverse unknowns
  • Do you think that the uncertainties in the research outputs were adequately described and managed?
  • Were the uncertainties communicated helpfully for use in policy development?
  • Did any groups attempt to exploit uncertainties or unknown factors in the research process? If yes, how were those attempts managed?
  • Were any unknowns that you considered important not addressed or inadequately addressed? What were they and how do you think this impacted the research outputs?
  • How well/adequately did the research process involve you in identifying unknowns?
Domain 3–Integrated research support for policy and practice change
  • How well did the research process support your decision-making/policy development process?
  • Did the research process produce relevant outputs and information for decision-makers/policy-makers, and did those outputs and information arrive at the right time?
  • Were the research outputs communicated helpfully for policy development?
  • Were the research outputs helpful for integrating mātauranga Māori in the policy? (Only for case study 1).
  • Do you think that the research outputs were presented impartially (as opposed to advocating for a position)?
Additional evaluation questions
  • What is your overall evaluation of the research process and its outputs for the development and implementation of policy?
  • How legitimate was the research process?
  • How credible were the research results?
  • How relevant were the research results to subsequent policy development?
  • What would have led to a greater sense of legitimacy, credibility, and relevance regarding the research (if not covered above)?

Appendix C. Survey Questions

  • Note: the questions below were generic to all projects surveyed. However, each project was sent a personalised questionnaire that referred to their particular project.
  • My job/role/next-user/end-user group is: _____________________
  • My level of familiarity with XXX research project is: 0 = not at all familiar, 1 = slightly familiar, 2 = somewhat familiar, 3 = moderately familiar, 4 = very familiar
  • My frequency of engagement with the XXX research team was: 0 = never, 1 = rarely, 2 = occasionally/sometimes, 3 = a moderate amount, 4 = a great deal
Domain 1: Synthesising stakeholder and disciplinary knowledge
  • As a next or end-user of the XXX project please rate how useful the knowledge synthesis and research outputs were for understanding the problem and proposed solutions. Not at all useful = 0, moderately useful = 2, very useful = 4
  • As a next or end-user of the XXX project, please rate how useful the knowledge synthesis and research outputs were for helping you understand the perspectives and values of all the affected or interested parties. Not at all useful = 0, moderately useful = 2, very useful = 4
  • Was the project relevant to Māori? NA/Yes. If yes, how useful was the integration of mātauranga Māori into the research outputs for understanding the Māori perspective to the research problem and proposed solutions? Not at all useful = 0, moderately useful = 2, very useful = 4
  • How usefully was the perspective of your industry sector (next-, end-user stakeholder group) reflected in the research knowledge synthesis and outputs? Not at all useful = 0, moderately useful = 2, very useful = 4
  • Were any affected or interested parties and/or disciplinary knowledge sources not well represented or excluded from the research outputs? If yes, who and why should they have been better represented?
Domain 2: Understanding and managing diverse unknowns
  • How useful was the analysis of uncertainties and unknowns, which might influence the problem issue and proposed solutions, in the research outputs? Not at all useful = 0, moderately useful = 2, and very useful = 4
  • How useful was the communication of uncertainties and unknowns, in the research outputs, for use in policy development or practice implementation? Not at all useful = 0, moderately useful = 2, very useful = 4
  • Were any uncertainties or unknowns that you considered important not addressed or inadequately addressed in the research outputs? What were they and how do you think this impacted the usefulness of the research outputs?
Domain 3: Support for policy/practice implementation
  • How useful were the research project outputs for supporting policy development or practice implementation? Not at all useful = 0, moderately useful = 2, and very useful = 4
  • How useful were the research outputs in terms of providing timely, relevant information for policy-makers or practitioners? Not at all useful = 0, moderately useful = 2, very useful = 4
  • How usefully were the research outputs communicated for policy development or practice implementation? Not at all useful = 0, moderately useful = 2, very useful = 4
  • If mātauranga Māori was relevant to the project, how useful were the research outputs for integrating mātauranga Māori into policy development or practice change? Not at all useful = 0, moderately useful = 2, and very useful = 4, NA
  • How objectively were the research results presented? Not at all objective = 0, moderately objective = 2, and very objective = 4
Additional review questions
  • What is your overall evaluation of the usefulness of the research project outputs for policy development or practice implementation? Not at all useful = 0, moderately useful = 2, and very useful = 4
  • How legitimate was the research process and outputs? Not at all legitimate = 0, moderately legitimate = 2, and very legitimate = 4
  • How credible were the research results and outputs? Not at all credible = 0, moderately credible = 2, and very credible = 4
  • How relevant were the research results to subsequent policy development or practice implementation? Not at all relevant = 0, moderately relevant = 2, and very relevant = 4
  • What would have led to a greater sense of legitimacy, credibility, and relevance of the research outputs?
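To make the scoring concrete, the sketch below shows one plausible way the individual 0–4 item responses from the survey above could be rolled up into the domain-level usefulness scores reported in Table 3. The paper does not specify the aggregation method, so the simple averaging, the data structure, and all numbers here are illustrative assumptions rather than study data.

```python
from statistics import mean

# Hypothetical responses on the 0-4 scales from Appendix C (not study data).
# Each inner list holds one respondent's item scores for a domain.
responses = {
    "Domain 1": [[3, 4, 3], [4, 4, 3], [3, 3, 4]],
    "Domain 2": [[2, 3], [3, 3], [2, 2]],
    "Domain 3": [[4, 3, 3], [3, 4, 4], [3, 3, 3]],
}

def domain_score(per_respondent):
    """Average each respondent's items, then average across respondents
    (one plausible aggregation; the paper may have weighted differently)."""
    return mean(mean(items) for items in per_respondent)

for domain, data in responses.items():
    print(f"{domain}: usefulness = {domain_score(data):.2f}")

# An overall score taken as the mean of the three domain scores.
overall = mean(domain_score(d) for d in responses.values())
print(f"All domains: {overall:.2f}")
```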

References

  1. Roux, D.J.; Stirzaker, R.J.; Breen, C.M.; Lefroy, E.C.; Cresswell, H.P. Framework for participative reflection on the accomplishment of transdisciplinary research programs. Environ. Sci. Policy 2010, 13, 733–741. [Google Scholar] [CrossRef]
  2. Gaziulusoy, A.I.; Ryan, C.; McGrail, S.; Chandler, P.; Twomey, P. Identifying and addressing challenges faced by transdisciplinary research teams in climate change research. J. Clean. Prod. 2016, 123, 55–64. [Google Scholar] [CrossRef]
  3. Herrero, P.; Dedeurwaerdere, T.; Osinski, A. Design features for social learning in transformative transdisciplinary research. Sustain. Sci. 2019, 14, 751–769. [Google Scholar] [CrossRef]
  4. Dovers, S. Clarifying the imperative of integration research for sustainable environmental management. J. Res. Pract. 2005, 1, 1–19. [Google Scholar]
  5. Mauser, W.; Klepper, G.; Rice, M.; Schmalzbauer, B.S.; Hackmann, H.; Leemans, R.; Moore, H. Transdisciplinary global change research: The co-creation of knowledge for sustainability. Curr. Opin. Environ. Sustain. 2013, 5, 420–431. [Google Scholar] [CrossRef] [Green Version]
  6. Pohl, C.; Hadorn, G.H. Methodological challenges of transdisciplinary research. Nat. Sci. Soc. 2008, 16, 111–121. [Google Scholar] [CrossRef] [Green Version]
  7. Mitchell, C.; Cordell, D.; Fam, D. Beginning at the end: The outcome spaces framework to guide purposive transdisciplinary research. Futures 2015, 65, 86–96. [Google Scholar] [CrossRef] [Green Version]
  8. Van Kerkhoff, L. Developing integrative research for sustainability science through a complexity principles-based approach. Sustain. Sci. 2014, 9, 143–155. [Google Scholar] [CrossRef]
  9. Stokols, D. Toward a science of transdisciplinary action research. Am. J. Community Psychol. 2006, 38, 63–77. [Google Scholar] [CrossRef]
  10. Haider, L.J.; Hentati-Sundberg, J.; Giusti, M.; Goodness, J.; Hamann, M.; Masterson, V.A.; Meacham, M.; Merrie, A.; Ospina, D.; Schill, C.; et al. The undisciplinary journey: Early-career perspectives in sustainability science. Sustain. Sci. 2018, 13, 191–204. [Google Scholar] [CrossRef] [PubMed]
  11. Bammer, G. Disciplining Interdisciplinarity: Integration and Implementation Sciences for Researching Complex Real-World Problems; ANU Press: Acton, Australia, 2013. [Google Scholar]
  12. Schön, D.A. Knowing-In-Action: The New Scholarship Requires a New Epistemology. Chang. Mag. High. Learn. 1995, 27, 27–34. [Google Scholar] [CrossRef]
  13. Small, B.; Payne, T.; Munguia, O.M.D.O. Developing Reliable and Valid Measures for Science Team Process Success Factors in Transdisciplinary Research. Int. J. Interdiscip. Organ. Stud. 2015, 10, 1–22. [Google Scholar] [CrossRef]
  14. Huutoniemi, K.; Klein, J.T.; Bruun, H.; Hukkinen, J. Analyzing interdisciplinarity: Typology and indicators. Res. Policy 2010, 39, 79–88. [Google Scholar] [CrossRef]
  15. Frescoln, L.M.; Arbuckle, J.G. Changes in perceptions of transdisciplinary science over time. Futures 2015, 73, 136–150. [Google Scholar] [CrossRef]
  16. Pennington, D.D.; Simpson, G.L.; McConnell, M.S.; Fair, J.M.; Baker, R.J. Transdisciplinary Research, Transformative Learning, and Transformative Science. Bioscience 2013, 63, 564–573. [Google Scholar] [CrossRef] [Green Version]
  17. Voinov, A.; Kolagani, N.; McCall, M.K.; Glynn, P.D.; Kragt, M.E.; Ostermann, F.O.; Pierce, S.A.; Ramu, P. Modelling with stakeholders—Next generation. Environ. Model. Softw. 2016, 77, 196–220. [Google Scholar] [CrossRef]
  18. Ayre, M.; Nettle, R. Doing integration in catchment management research: Insights into a dynamic learning process. Environ. Sci. Policy 2015, 47, 18–31. [Google Scholar] [CrossRef] [Green Version]
  19. Berkett, N.; Fenemor, A.; Newton, M.; Sinner, J. Collaborative freshwater planning: Changing roles for science and scientists. Australas. J. Water Resour. 2018, 22, 39–51. [Google Scholar] [CrossRef]
  20. Carew, A.L.; Wickson, F. The TD Wheel: A heuristic to shape, support and evaluate transdisciplinary research. Futures 2010, 42, 1146–1155. [Google Scholar] [CrossRef]
  21. Polk, M. Achieving the promise of transdisciplinarity: A critical exploration of the relationship between transdisciplinary research and societal problem solving. Sustain. Sci. 2014, 9, 439–451. [Google Scholar] [CrossRef]
  22. Fernandez, R.J. How to be a more effective environmental scientist in management and policy contexts. Environ. Sci. Policy 2016, 64, 171–176. [Google Scholar] [CrossRef]
  23. Lang, D.J.; Wiek, A.; Bergmann, M.; Stauffacher, M.; Martens, P.; Moll, P.; Swilling, M.; Thomas, C.J. Transdisciplinary research in sustainability science: Practice, principles, and challenges. Sustain. Sci. 2012, 7, 25–43. [Google Scholar] [CrossRef]
  24. Maasen, S.; Lieven, O. Transdisciplinarity: A new mode of governing science? Sci. Public Policy 2006, 33, 399–410. [Google Scholar] [CrossRef] [Green Version]
  25. Krueger, T.; Page, T.; Hubacek, K.; Smith, L.; Hiscock, K. The role of expert opinion in environmental modelling. Environ. Model. Softw. 2012, 36, 4–18. [Google Scholar] [CrossRef]
  26. Tress, G.; Tress, B.; Fry, G. Analysis of the barriers to integration in landscape research projects. Land Use Policy 2007, 24, 374–385. [Google Scholar] [CrossRef]
  27. Stock, P.; Burton, R.J.F. Defining Terms for Integrated (Multi-Inter-Trans-Disciplinary) Sustainability Research. Sustainability 2011, 3, 1090–1111. [Google Scholar] [CrossRef] [Green Version]
  28. Harris, F.; Lyon, F. Transdisciplinary Environmental Research: A Review of Approaches to Knowledge Co-Production. 2014. Available online: http://researchprofiles.herts.ac.uk/portal/files/12138376/Harris_and_Lyon_Nexus_thinkpiece_002.pdf (accessed on 30 April 2021).
  29. Duncan, R. Ways of knowing—Out-of-sync or incompatible? Framing water quality and farmers’ encounters with science in the regulation of non-point source pollution in the Canterbury region of New Zealand. Environ. Sci. Policy 2016, 55, 151–157. [Google Scholar] [CrossRef] [Green Version]
  30. Konig, B.; Diehl, K.; Tscherning, K.; Helming, K. A framework for structuring interdisciplinary research management. Res. Policy 2013, 42, 261–272. [Google Scholar] [CrossRef]
  31. Popa, F.; Guillermin, M.; Dedeurwaerdere, T. A pragmatist approach to transdisciplinarity in sustainability research: From complex systems theory to reflexive science. Futures 2015, 65, 45–56. [Google Scholar] [CrossRef] [Green Version]
  32. Pennington, D. A conceptual model for knowledge integration in interdisciplinary teams: Orchestrating individual learning and group processes. J. Environ. Stud. Sci. 2016, 6, 300–312. [Google Scholar] [CrossRef]
  33. Stokols, D.; Fuqua, J.; Gress, J.; Harvey, R.; Phillips, K.; Baezconde-Garbanati, L.; Unger, J.; Palmer, P.; Clark, M.A.; Colby, S.M.; et al. Evaluating transdisciplinary science. Nicotine Tob. Res. 2003, 5, 21–39. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  34. Brandt, P.; Ernst, A.; Gralla, F.; Luederitz, C.; Lang, D.J.; Newig, J.; Reinert, F.; Abson, D.J.; von Wehrden, H. A review of transdisciplinary research in sustainability science. Ecol. Econ. 2013, 92, 1–15. [Google Scholar] [CrossRef]
  35. Duncan, R.; Robson-Williams, M.; Nicholas, G.; Turner, J.A.; Smith, R.; Diprose, D. Transformation Is ‘Experienced, Not Delivered’: Insights from Grounding the Discourse in Practice to Inform Policy and Theory. Sustainability 2018, 10, 3177. [Google Scholar] [CrossRef] [Green Version]
  36. Mattor, K.; Betsill, M.; Huayhuaca, C.; Huber-Stearns, H.; Jedd, T.; Sternlieb, F.; Bixler, P.; Luizza, M.; Cheng, A.S. Transdisciplinary research on environmental governance: A view from the inside. Environ. Sci. Policy 2014, 42, 90–100. [Google Scholar] [CrossRef] [Green Version]
  37. Duncan, R.; Robson-Williams, M.; Fam, D. Assessing research impact potential: Using the transdisciplinary Outcome Spaces Framework with New Zealand’s National Science Challenges. Kōtuitui 2020, 15, 217–235. [Google Scholar] [CrossRef] [Green Version]
  38. Jahn, T.; Bergmann, M.; Keil, F. Transdisciplinarity: Between mainstreaming and marginalization. Ecol. Econ. 2012, 79, 1–10. [Google Scholar] [CrossRef]
  39. Locatelli, G.; Mancini, M.; Romano, E. Systems Engineering to Improve the Governance in Complex Project Environments. Int. J. Proj. Manag. 2014, 32, 1395–1410. [Google Scholar] [CrossRef]
  40. Klein, J.T. Prospects for transdisciplinarity. Futures 2004, 36, 515–526. [Google Scholar] [CrossRef]
  41. Jahn, T. Transdisciplinarity in the practice of research. In Transdisziplinäre Forschung: Integrative Forschungsprozesse Verstehen und Bewerten; Campus Verlag: Frankfurt, Germany, 2008; pp. 21–37. [Google Scholar]
  42. Apgar, J.M.; Argumedo, A.; Allen, W. Building Transdisciplinarity for Managing Complexity: Lessons from Indigenous Practice. Int. J. Interdiscip. Soc. Sci. Annu. Rev. 2009, 4, 255–270. [Google Scholar] [CrossRef]
  43. Midgley, G. Systemic Intervention. In the Sage Handbook of Action Research, 3rd ed.; Bradbury-Huang, H., Ed.; Sage: London, UK, 2015. [Google Scholar]
  44. Midgley, G.; Ahuriri-Driscoll, A.; Foote, J.; Hepi, M.; Taimona, H.; Rogers-Koroheke, M.; Baker, V.; Gregor, J.; Gregory, W.; Lange, M.; et al. Practitioner identity in systemic intervention: Reflections on the promotion of environmental health through Māori community development. Syst. Res. Behav. Sci. 2007, 24, 233–247. [Google Scholar] [CrossRef]
  45. Vereijssen, J.; Srinivasan, M.; Dirks, S.; Fielke, S.; Jongmans, C.; Agnew, N.; Klerkx, L.; Pinxterhuis, I.; Moore, J.; Edwards, P. Addressing complex challenges using a co-innovation approach: Lessons from five case studies in the New Zealand primary sector. Outlook Agric. 2017, 46, 108–116. [Google Scholar] [CrossRef]
  46. Bammer, G.; Smithson, M. Uncertainty and Risk: Multidisciplinary Perspectives; Earthscan: London, UK, 2008. [Google Scholar]
  47. Robson-Williams, M.; Norton, N.; Davie, T.; Taylor, K.; Kirk, N. The Changing Role of Scientists in Supporting Collaborative Land and Water Policy in Canterbury, New Zealand. Case Stud. Environ. 2018, 2, 1–5. [Google Scholar] [CrossRef]
  48. Burns, W. The Case for Case Studies in Confronting Environmental Issues. Case Stud. Environ. 2017, 1, 1–4. [Google Scholar] [CrossRef] [Green Version]
  49. Baxter, P.; Jack, S. Qualitative case study methodology: Study design and implementation for novice researchers. Qual. Rep. 2008, 13, 544–559. [Google Scholar]
  50. Seawright, J.; Gerring, J. Case Selection Techniques in Case Study Research: A Menu of Qualitative and Quantitative Options. Polit. Res. Q. 2008, 61, 294–308. [Google Scholar] [CrossRef]
  51. Shareia, B. Qualitative and quantitative case study research method on social science: Accounting perspective. Int. J. Eng. Econ. Manag. Eng. 2016, 10, 3849–3854. [Google Scholar]
  52. Yin, R. Case Study Research: Design and Methods; Sage: Thousand Oaks, CA, USA, 2009. [Google Scholar]
  53. Small, B.; Robson-Williams, M.; Payne, P.; Turner, J.; Robson-Williams, R.; Horita, A. Co-innovation and Integration and Implementation Sciences: Measuring their research impact—An examination of five New Zealand primary sector case studies. NJAS 2021. (In press) [Google Scholar]
  54. Arkesteijn, M.; van Mierlo, B.; Leeuwis, C. The need for reflexive evaluation approaches in development cooperation. Evaluation 2015, 21, 99–115. [Google Scholar] [CrossRef]
  55. Patton, M.Q. Developmental Evaluation: Applying Complex Concepts to Enhance Innovation and Use; The Guilford Press: London, UK, 2011. [Google Scholar]
  56. Robson, M.C. Technical Report to Support Water Quality and Water Quantity Limit Setting Process in Selwyn Waihora Catchment: Predicting Consequences of Future Scenarios: Overview Report; Environment Canterbury: Christchurch, New Zealand, 2014.
  57. Williams, R.H.; Brown, H.E.; Ford, R.; Lilburne, L.; Pinxterhuis, I.J.B.; Robson, M.C.; Snow, V.O.; Taylor, K.; von Pein, T. The Matrix of Good Management: Towards an understanding of farm systems, good management practice and nutrient losses in Canterbury. In Moving Farm Systems to Improved Nutrient Attenuation; Massey University: Palmerston North, New Zealand, 2015; p. 14. [Google Scholar]
  58. Williams, R.H.; Brown, H.E.; Ford, R.; Lilburne, L.; Pinxterhuis, I.J.B.; Robson, M.C.; Snow, V.O.; Taylor, K.; von Pein, T. The Matrix of Good Management: Defining good management practices and associated nutrient losses across primary industries. In Nutrient Management for the Farm, Catchment and Community; Massey University: Palmerston North, New Zealand, 2014; p. 8. [Google Scholar]
  59. Robson, M.C.; Brown, H.E.; Hume, E.; Lilburne, L.; McAuliffe, R.; Pinxterhuis, I.J.B.; Snow, V.O.; Williams, R.H.; B+LNZ; DEVELOPMENTMATTERS; et al. Overview Report—Canterbury Matrix of Good Management Project; Report no. R15/104; Environment Canterbury: Christchurch, New Zealand, 2015.
  60. Pinxterhuis, I.; Dirks, S.; Bewsell, D.; Edwards, P.; Brazendale, R.; Turner, J.A. Co-innovation to improve profit and environmental performance of dairy farm systems in New Zealand. Rural Ext. Innov. Syst. J. 2018, 14, 23–33. [Google Scholar]
  61. Srinivasan, M.; Bewsell, D.; Jongmans, C.; Elley, G. Just-in-case to justified irrigation: Applying co-innovation principles to irrigation water management. Outlook Agric. 2017, 46, 138–145. [Google Scholar] [CrossRef]
  62. Srinivasan, M.S.; Bewsell, D.; Jongmans, C.; Elley, G. Research idea to science for impact: Tracing the significant moments in an innovation based irrigation study. Agric. Water Manag. 2019, 212, 181–192. [Google Scholar] [CrossRef]
  63. Robson-Williams, M.; Small, B.; Robson-Williams, R. Designing transdisciplinary projects for collaborative policymaking: The Integration and Implementation Sciences framework as a tool for reflection. GAIA Ecol. Perspect. Sci. Soc. 2020, 29, 170–175. [Google Scholar] [CrossRef]
  64. Funtowicz, S.O.; Ravetz, J.R. Science for the post-normal age. Futures 1993, 25, 739–755. [Google Scholar] [CrossRef]
  65. Berkes, F.; Colding, J.; Folke, C. Rediscovery of traditional ecological knowledge as adaptive management. Ecol. Appl. 2000, 10, 1251–1262. [Google Scholar] [CrossRef]
  66. Bergmann, M.; Jahn, T.; Knobloch, T.; Krohn, W.; Pohl, C.; Schramm, E. Methods for Transdisciplinary Research: A Primer for Practice; Campus Verlag: Frankfurt, Germany, 2012. [Google Scholar]
  67. Hansson, S.; Polk, M. Assessing the impact of transdisciplinary research: The usefulness of relevance, credibility, and legitimacy for understanding the link between process and impact. Res. Eval. 2018, 27, 132–144. [Google Scholar] [CrossRef]
  68. Newig, J.; Pahl-Wostl, C.; Sigel, K. The role of public participation in managing uncertainty in the implementation of the Water Framework Directive. Eur. Environ. 2005, 15, 333–343. [Google Scholar] [CrossRef]
  69. Robson-Williams, M.; Small, B.; Robson-Williams, R. A week in the life of a transdisciplinary researcher: Failures in research to support policy for water quality management in New Zealand’s South Island. In Interdisciplinary and Transdisciplinary Failures: Lessons Learned from Cautionary Tales; Fam, D., O’Rourke, M., Eds.; Routledge: London, UK, 2020; pp. 131–146. [Google Scholar]
  70. Pohl, C. From science to policy through transdisciplinary research. Environ. Sci. Policy 2008, 11, 46–53. [Google Scholar] [CrossRef]
  71. Lemos, M.C.; Morehouse, B.J. The co-production of science and policy in integrated climate assessments. Glob. Environ. Chang. 2005, 15, 57–68. [Google Scholar] [CrossRef]
  72. Sundin, A. Make Your Science Sticky-Storytelling as a Science Communication Tool. Stockholm Environment Institute. Available online: https://www.sei.org/perspectives/make-science-sticky-storytelling-science-communication-tool/ (accessed on 27 June 2019).
  73. Torres, D.H.; Pruim, D.E. Scientific storytelling: A narrative strategy for scientific communicators. Commun. Teach. 2019, 33, 107–111. [Google Scholar] [CrossRef]
  74. Cash, D.W.; Clark, W.C.; Alcock, F.; Dickson, N.M.; Eckley, N.; Guston, D.H.; Jäger, J.; Mitchell, R.B. Knowledge systems for sustainable development. Proc. Natl. Acad. Sci. USA 2003, 100, 8086–8091. [Google Scholar] [CrossRef] [Green Version]
  75. Midgley, G. Systemic Intervention: Philosophy, Methodology, and Practice; Kluwer/Plenum: New York, NY, USA, 2000. [Google Scholar]
  76. Boyd, A.; Brown, M.; Midgley, G. Systemic Intervention for Community OR: Developing Services with Young People (under 16) Living on the Streets; Kluwer/Plenum: New York, NY, USA, 2004. [Google Scholar]
  77. Van Mierlo, B.; Arkesteijn, M.; Leeuwis, C. Enhancing the Reflexivity of System Innovation Projects with System Analyses. Am. J. Eval. 2010, 31, 143–161. [Google Scholar] [CrossRef] [Green Version]
  78. Klerkx, L.; Aarts, N.; Leeuwis, C. Adaptive management in agricultural innovation systems: The interactions between innovation networks and their environment. Agric. Syst. 2010, 103, 390–400. [Google Scholar] [CrossRef]
  79. Dilling, L.; Lemos, M.C. Creating usable science: Opportunities and constraints for climate knowledge use and their implications for science policy. Glob. Environ. Chang. 2011, 21, 680–689. [Google Scholar] [CrossRef]
  80. Cheruvelil, K.S.; Soranno, P.A.; Weathers, K.C.; Hanson, P.C.; Goring, S.J.; Filstrup, C.T.; Read, E.K. Creating and maintaining high-performing collaborative research teams: The importance of diversity and interpersonal skills. Front. Ecol. Environ. 2014, 12, 31–38. [Google Scholar] [CrossRef] [Green Version]
Figure 1. A summary of the three domains and five questions of the Integration and Implementation Science framework.
Figure 2. Methodological steps for testing the hypothesis. Methods for each step are described in the experimental procedures section. The question numbers refer to the questions in the i2S framework.
Figure 3. Relationship between consideration of i2S elements (domain level) and usefulness of research process and outputs for seven case studies.
Table 1. A brief description of the seven case studies.

| Case Study Number | Case Study Name | Case Study Project Aim | References |
|---|---|---|---|
| 1 | Selwyn Waihora | To support and inform a collaborative policy process in setting water quality and quantity limits in the Selwyn Waihora catchment. | [47,56] |
| 2 | Matrix of Good Management | To define primary industry-agreed good management practices and model the nutrient losses from farms operating using good management practice. | [57,58,59] |
| 3 | Nutrient Management | To develop and test on-farm practices to help farmers comply with changing and increasingly stringent regional water quality regulations. | [60] |
| 4 | Log Segregation | To develop cost-effective approaches to characterise and deal with variation in wood properties within and between trees to enhance value-added production. | [45] |
| 5 | Heifer Rearing | To improve the reproductive performance of New Zealand’s dairy herd by lifting the proportion of heifers entering the national herd at target live weight. | [60] |
| 6 | Water Use Efficiency | To improve on-farm irrigation decisions using better characterisation of irrigation demand and accurate short-term weather forecasts. | [61,62] |
| 7 | Primary Innovation | To gain greater economic benefit and a more sustainable future from the performance of New Zealand’s primary industries, including science, through the use of Agricultural Innovation Systems. | https://www.beyondresults.co.nz/primary-innovation/about/ (accessed on 30 April 2021) |
Table 2. Case studies and next user assessments.

| Case Study No. | Case Study | Method for Collecting Data from Next Users | Number of Next Users | Role of Next Users |
|---|---|---|---|---|
| 1 | Selwyn Waihora | Interview | 5 | Decision-maker, planner, and policy-maker |
| 2 | Matrix of Good Management | Interview | 6 | Industry representative, planner, and policy-maker |
| 3 | Nutrient Management | Survey | 5 | Policy-maker, industry representatives, and farmers |
| 4 | Log Segregation | Survey | 2 | Industry representatives |
| 5 | Heifer Rearing | Survey | 5 | Industry representatives and farmers |
| 6 | Water Use Efficiency | Survey | 4 | Policy-maker, industry representative, and farmers |
| 7 | Primary Innovation | Survey | 5 | Policy-makers, industry representatives, and researcher |
Table 3. Consideration of i2S elements and usefulness of research process and outcomes by next users for seven case studies, scored on a 0–4 scale.

|  | Assessed Consideration of i2S Elements | Assessed Usefulness by Next Users |
|---|---|---|
| Case study 1: Domain 1 | 3.30 | 3.2 |
| Case study 1: Domain 2 | 2.70 | 2.5 |
| Case study 1: Domain 3 | 3.34 | 3.2 |
| Case study 1: all frameworks | 3.1 | 3.0 |
| Case study 2: Domain 1 | 3.73 | 3.5 |
| Case study 2: Domain 2 | 2.63 | 2.8 |
| Case study 2: Domain 3 | 3.33 | 3.2 |
| Case study 2: all frameworks | 3.2 | 3.2 |
| Case study 3: Domain 1 | 3.10 | 3.4 |
| Case study 3: Domain 2 | 2.90 | 3.5 |
| Case study 3: Domain 3 | 3.30 | 3.5 |
| Case study 3: all frameworks | 3.1 | 3.5 |
| Case study 4: Domain 1 | 2.68 | 3.0 |
| Case study 4: Domain 2 | 2.30 | 2.8 |
| Case study 4: Domain 3 | 2.65 | 3.0 |
| Case study 4: all frameworks | 2.5 | 2.9 |
| Case study 5: Domain 1 | 2.70 | 3.0 |
| Case study 5: Domain 2 | 1.67 | 1.8 |
| Case study 5: Domain 3 | 2.60 | 2.9 |
| Case study 5: all frameworks | 2.3 | 2.5 |
| Case study 6: Domain 1 | 3.04 | 3.3 |
| Case study 6: Domain 2 | 2.21 | 3.2 |
| Case study 6: Domain 3 | 3.35 | 3.2 |
| Case study 6: all frameworks | 2.9 | 3.2 |
| Case study 7: Domain 1 | 3.25 | 3.2 |
| Case study 7: Domain 2 | 3.19 | 3.4 |
| Case study 7: Domain 3 | 3.61 | 3.4 |
| Case study 7: all frameworks | 3.4 | 3.3 |
Table 4. Correlations between case study consideration of i2S elements and assessed usefulness of research process and outputs for seven case studies.

| Consideration of i2S Elements | Assessed Usefulness by Next Users | Correlation |
|---|---|---|
| All frameworks | All domains | 0.79 (p < 0.001) |
| Domain 1 | Domain 1 | 0.84 (p = 0.018) |
| Domain 2 | Domain 2 | 0.78 (p = 0.039) |
| Domain 3 | Domain 3 | 0.81 (p = 0.027) |
Table 5. Correlation between case study consideration of i2S elements and assessed overall usefulness of research process and outputs for seven case studies.

| Consideration of i2S Elements | Assessed Usefulness by Next Users | Correlation |
|---|---|---|
| All frameworks | Overall assessment | 0.76 (p = 0.045) |
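As a check on reproducibility, the domain-level correlations in Tables 4 and 5 can be recomputed from the scores in Table 3. The sketch below is a minimal example that assumes Pearson's correlation (the paper does not name the statistic used) and applies scipy.stats.pearsonr to values transcribed from Table 3; because Table 3 reports rounded scores, small differences from the published coefficients (e.g., roughly 0.83 vs. the published 0.84 for Domain 1) are expected.

```python
from scipy.stats import pearsonr

# Domain-level scores transcribed from Table 3 for the seven case studies:
# (assessed consideration of i2S elements, assessed usefulness by next users).
table3 = {
    "Domain 1": ([3.30, 3.73, 3.10, 2.68, 2.70, 3.04, 3.25],
                 [3.2, 3.5, 3.4, 3.0, 3.0, 3.3, 3.2]),
    "Domain 2": ([2.70, 2.63, 2.90, 2.30, 1.67, 2.21, 3.19],
                 [2.5, 2.8, 3.5, 2.8, 1.8, 3.2, 3.4]),
    "Domain 3": ([3.34, 3.33, 3.30, 2.65, 2.60, 3.35, 3.61],
                 [3.2, 3.2, 3.5, 3.0, 2.9, 3.2, 3.4]),
}

# Pearson correlation is an assumption here; the published tables report
# only coefficients and p-values, e.g., Domain 1: 0.84 (p = 0.018).
for domain, (consideration, usefulness) in table3.items():
    r, p = pearsonr(consideration, usefulness)
    print(f"{domain}: r = {r:.2f}, p = {p:.3f}")
```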