Essay

What Is It Like to Make a Prototype? Practitioner Reflections on the Intersection of User Experience and Digital Humanities/Social Sciences during the Design and Delivery of the “Getting to Mount Resilience” Prototype

1 Environmental Informatics Group, Environment Business Unit, Commonwealth Scientific and Industrial Research Organisation (CSIRO), Canberra, ACT 2601, Australia
2 Justice and Technoscience Lab (JusTech), School of Regulation and Global Governance, College of Arts and Social Sciences, The Australian National University, Canberra, ACT 2601, Australia
Informatics 2023, 10(3), 70; https://doi.org/10.3390/informatics10030070
Submission received: 17 May 2023 / Revised: 10 July 2023 / Accepted: 17 August 2023 / Published: 28 August 2023
(This article belongs to the Section Social Informatics and Digital Humanities)

Abstract

The digital humanities and social sciences are critical for addressing societal challenges such as climate change and disaster risk reduction. One way in which the digital humanities and social sciences add value, particularly in an increasingly digitised society, is by engaging different communities through digital services and products. Alongside this observation, the field of user experience (UX) has also become popular in industrial settings. UX specifically concerns designing and developing digital products and solutions, and, while it is popular in business and other academic domains, there is disquiet in the digital humanities/social sciences towards UX and a general lack of engagement. This paper shares the reflections and insights of a digital humanities/social science practitioner working on a UX project to build a prototype demonstrator for disaster risk reduction. Insights come from formal developmental and participatory evaluation activities, as well as qualitative self-reflection. The paper identifies lessons learnt, noting challenges experienced—including feelings of uncertainty and platform dependency—and reflects on the hesitancy practitioners may feel and the potential barriers to participation between UX and the digital humanities/social sciences. It concludes that digital humanities/social science practitioners have few skill barriers and offer a valued perspective, but unclear opportunities for critical engagement may present a barrier.

1. Introduction

The digital humanities and social sciences have an important part to play in addressing the challenges of climate change and disaster risk reduction (DRR). Meaningful progress on climate change cannot be made without cultural change [1] and scholarship that engages the deeply human aspects of this crisis [2]. Practically speaking, the digital humanities and social sciences offer innovative ways of engaging different communities and groups in conversations on climate and disasters. This can be seen in developments such as citizen science that use digital technologies to facilitate community participation in scientific endeavours which might benefit the environment [3]. Citizen science projects illustrate how biophysical sciences, as well as the digital humanities and social sciences, increasingly engage one another through digitally enabled products, services, and spaces to address contemporary problems.
Citizen science, however, is not the only example of digital services and products dedicated to community or user engagement. In recent years, the field of user experience (or UX) has become increasingly popular in the private sector. The phrase UX is often used in a variety of ways. It can denote the field in general (for instance, I might refer to myself as a UX professional), the practice of performing this work (for example, I might be performing UX work on a specific project), or the specifics of what users experience (for instance, I might ask a colleague what the UX of my software was). The technical definition of UX, though, refers to "technolog[ies] that fulfil more than just instrumental needs in a way that acknowledges its use as a subjective, situated, complex and dynamic encounter" [4] (p. 96). This commonly refers to digitally mediated technologies, products, and other experiences that provide a user with an aesthetically pleasing, affectively engaging, experientially unique, and technically functional experience [4] (p. 95). For example, the popular language-learning application Duolingo provides language education through a digitally gamified user experience, wrapping learning content with aesthetically pleasing interfaces and responsive reward systems to encourage user participation [5]. Duolingo is popular not just because of its function but also because it provides a positive user experience around this functional need. The assumption behind private-sector demand is that well-designed UX creates sustained engagement and value as part of a data-driven business model [6].
However, the intersection of UX and business has raised ethical concerns. Dourish [7] notes that UX and supporting disciplines such as Human–Computer Interaction (HCI) have potentially become caught in a “legitimacy trap”. By becoming closely associated with corporate, business, and profit-driven endeavours, UX/HCI’s ability to offer meaningful ethical or normative critiques has become limited, while simultaneously building the acceptability of these more limited applications. This position is not helped by the fact that UX can be used with ill-intent. For instance, researchers have noted the existence of “dark patterns” in UX—specific interactive relationships realised through UX that place shareholder value over user value and human dignity [8,9]. Using cognitive biases, user-interface design quirks and other underhanded techniques, dark patterns create a user experience that wilfully and knowingly exploits users for the benefit of corporate or for-profit interests [10,11]. While aware of these issues, UX practitioners feel that their ability to act ethically is limited, despite a desire to do so [12]. Such observations do not suggest that the totality of UX knowledge and practice is being used for ill-intent; instead, they simply mean that UX is a contested territory of knowledge and practice and that not all opportunities for UX to influence society are inherently positive.
There are still many opportunities for common-good outcomes from UX, despite these potentially contested and bad-faith uses. UX can be an avenue by which complex interdisciplinary problems are engaged, with UX knowledge, practice, and outputs creating new and productive relationships between digital services/products and domain science (i.e., climate change and climate adaptation underpinned and enabled by the digital humanities/social sciences). As Rudnicki [13] points out, UX might be viewed as a translational social science that creatively and innovatively uses approaches from social science and the digital humanities to engage a wide group of people in the co-production of knowledge. The discipline of design itself is also heavily connected with the digital humanities and social sciences [14,15], and design is, in turn, closely connected with modern developments in UX [16]. With this being said, however, Rudnicki [13] also points out the significant disquiet many social scientists feel towards UX and its potential to represent a softening of the critical inquiry that defines much social science knowledge, a concern also raised by Dourish [7] earlier.
This disquiet may in part explain why we do not see more interdisciplinary collaboration between UX and the digital humanities/social sciences for addressing specific problems. For example, consider again the area of climate change and DRR. The main examples of collaboration between UX, the digital humanities/social sciences, and climate science here can be observed within disciplines such as Sustainable Human–Computer Interaction (S-HCI). S-HCI focuses on connecting HCI methods and theories with climate and sustainability issues [17,18] and, through HCI-informed experimentation and design, makes these public science issues more "visible and actionable" to society [19]. Through this connection, sustainability might occur as a result of using HCI methods (e.g., through designing and implementing energy dashboards that promote more sustainable energy use), or as a result of making HCI itself more sustainable (e.g., building awareness of the environmental constraints and impacts of computing infrastructure) [17]. S-HCI shares many methodologies and practitioners with UX work [20]. S-HCI also builds on HCI's diverse intellectual background, which includes the digital humanities and social sciences [21,22], and offers a glimpse into how UX (and related products and services) is being deployed to support climate and sustainability goals, using this interdisciplinary knowledge. It is therefore surprising that, despite robust academic development, S-HCI has produced relatively few successful climate-change tools. A recent study found only 40 examples of S-HCI tools on climate change and none on climate adaptation [19]. While not an exhaustive study, it is interesting to note this relative lack of successful examples, given the popularity of UX.
While it is beyond the scope of this paper to identify and address every potential contribution to this disparity, it is a reasonable assumption that, as Rudnicki [13] stated, there are barriers, ambiguities, and complications at play that prevent more successful interdisciplinary combinations of the digital humanities/social sciences, UX, and domain science (in the case of this paper, climate change/DRR).
It is the purpose of this paper to share the initial insights and lessons that I gleaned while participating in the development of a UX product for climate adaptation/DRR. I occupy a unique position that touches on several of the above threads: I have an academic background in digital social sciences and humanities; I work as a practitioner with an interdisciplinary team of biophysical and social scientists; and, as part of this work, I was asked to collaborate with designers and UX professionals to create a UX product. This offers me a situated perspective on the experience of working with UX and the chance to contribute to the conversations Rudnicki [13] identifies. The goal of this paper is to stimulate conversation and reflection on the experience of this work and to reflect on why those of us in the digital humanities/social science space might feel this disquiet. It is not the purpose of this paper to offer a defining normative stance on the value of the digital humanities and social sciences to UX, or vice versa, although this theme is certainly present in the discussions. Instead, in the tradition of classic sociological inquiry [23], it is to explore the relationship between private troubles/experiences (the experience of UX work through my reflective lens) and public issues (the implications of UX work) as part of initiating a conversation about why collaboration between the digital humanities/social sciences, UX, and domain areas is not more prominent.
I believe that this is important and relevant in several ways. First, as previously referenced, addressing critical challenges (i.e., climate change and DRR) in society requires the integration of the digital humanities and social sciences in the context of an increasingly digitalised society. It is important to be a part of this conversation if we are to contribute, especially given that the social sciences and humanities are going to be central to addressing many of the issues associated with these challenges [1,2,24,25]. Second, the reality is that UX is extremely popular in the private sector and a likely career option for the many digital humanities/social science professionals who do not find academic positions. Given that UX is sometimes not clearly understood, providing the digital humanities/social science community with some perspective on this work might be of value to those considering it. Third, I believe that the social sciences and digital humanities make great contributions to human endeavours; these contributions should be valued, considered, and shared across many aspects of human endeavour. The social sciences and digital humanities are integral to a human project that is emancipatory, just, civil, and free, as they provide the theoretical and practical tools to pursue critical inquiry, foster meaningful engagement, and imagine a better future [26,27,28]. Given the contested nature of both society at large and UX specifically, the digital humanities and social sciences stand to make great contributions to societal challenges if collaboration can occur.
With these rationales in mind, the paper presents initial reflections from my experience working on a UX project for climate change and DRR. Over the course of this work, several opportunities for reflection were provided, and I present high-level insights developed from these reflections. In doing this, I identify two major challenges I experienced through this work: (1) uncertainty about emotions and work practices and (2) platform dependency. I provide advice to other practitioners on addressing these challenges, using thought experiments to explore alternatives and share solutions. I ground these thought experiments in specific examples of practice that I experienced in my work, noting alternative paths and options that hindsight has given me. I close my reflections with discussion on Rudnicki's [13] work vis-à-vis my experience and on the assumptions behind potential barriers and complications that might be hindering collaboration between the social sciences/digital humanities and UX. I admit that barriers may exist—particularly around the capacity for normative critique (building on what Dourish [7] raises)—but contend that concerns around the capacity and value of the social sciences/digital humanities to engage with the domain of UX do not appear to be a barrier (at least in my experience). Indeed, the social sciences/digital humanities were fundamental to my contributions to UX work, and social scientists and digital humanities practitioners should feel able and empowered to contribute to UX, if they so wish.

2. Background

Between 2020 and 2021, I was part of an interdisciplinary team that developed resilient financial investment instruments for climate adaptation. This team, the Enabling Resilience Investment (ERI) team, was a collaboration between scientists at the Commonwealth Scientific and Industrial Research Organisation (CSIRO) and private-sector infrastructure consultants Value Advisory Partners (VAP). The project brought together an interdisciplinary group of scientists and infrastructure-consulting expertise to develop a set of tools and frameworks for assisting decision-makers in making climate-resilient investment decisions. This work responds to a growing need for infrastructure investment aligned with climate-adaptation principles [29,30]. However, such investments are often more expensive and require additional justification for decision-makers. The ERI toolset aimed to support decision-makers through this process with an end-to-end package of financial, planning, and engagement tools [31].
Because of the complex nature of these tools and the multiple potential audiences in government and industry, there was a need for an engagement approach that clearly introduced the toolset and explained where and how it might be used. To achieve this, a UX demonstrator was to be built. Similar to an interactive webpage or small software prototype, the demonstrator would aim to provide high-level walkthroughs for different target audiences, while also providing more detailed explanations and advice for anyone interested.
To maximise impact, the demonstrator piggybacked off "Big Weather", a recent special event aired by the Australian national broadcaster [32]. This special featured an augmented reality (AR) experience called "Mount Resilience", which described the effects of extreme weather and climate risk by using a fictional town, the titular Mount Resilience. Users of the AR experience could explore what a town might practically do to address extreme weather events—such as fireproofing homes, installing off-grid electricity, and using natural green space as fire/water breaks. To invite potential users to engage with the ERI toolset in a low-stakes, non-judgemental, and playful way, the demonstrator was envisioned as a prequel to "Mount Resilience" that explored the planning, decision-making, and financial steps necessary to fund a climate-adapted town. Thus, the demonstrator was named "Getting to Mount Resilience" (GtMR).
To create the demonstrator, a small design sub-team was formed (henceforth called the design team), with support from the larger group. The design team consisted of a VAP colleague and myself, and we completed the conceptual and content development of the prototype. We also had a project sponsor in the broader ERI team to support us. To facilitate the design and delivery process, external design consultants were brought into the design team to provide the underlying architecture and visual language (see [33] for an example of their approach). The demonstrator was built in Adobe XD by our consultant colleagues and further iterated by the design team. The product was completed at the close of financial year 2020–2021.

3. Materials and Methods

The insights and lessons learnt described in this paper were developed through engagement with a variety of developmental and reflective methods. These interactions began with a formal group reflection and learning process and continued through self-reflection and further research, culminating in the paper I now present.
Formal reflection and learning occurred sometime after the delivery of the GtMR prototype, as the team undertook an evaluation process to canvass team-member experiences and future options for the GtMR project. This reflection and learning process used a combination of approaches. A developmental evaluation approach [34] formed the core method. Developmental evaluation is an evaluation approach for social-change initiatives occurring within complex environments. The approach emphasises the use of dynamic reflection and learning as a part of ongoing project activities, supporting the growth and development of the project as it occurs. Our team drew on Lazarow and colleagues' [35] interpretation of developmental evaluation, particularly the use of structured reflection workshops, to capture insights and reflections. We adopted the six-stage process for reflection [35] for our workshop. We generalised the overarching questions and approach but focused our areas of reflection around the design activities used to build the GtMR demonstrator. See Table 1 for the generalised methodology and questions based on Lazarow and colleagues' [35] work.
By asking team members to reflect on their participation in this work, the evaluation evoked aspects of participatory evaluation, in that it invited those completing work into the assessment space to question their experience and the overall project in a broad and inclusive way [36,37]. This approach has been demonstrated to improve self-efficacy, empathy, and critical awareness (alongside other outcomes) among participants [36].
Data were collected using the online whiteboarding service Miro and analysed using a qualitative clustering and thematic analysis method (see [38,39,40]). Individual thoughts and comments were clustered into descriptive codes before being further grouped and connected with emergent themes and other descriptions. Each clustering and connection sought to move from descriptive codes to themes and categories, which provided explanations or insights into what was described.
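To make this clustering step more concrete, the sketch below illustrates in Python how individually coded comments might be rolled up into emergent themes. It is a minimal, hypothetical illustration only: the comments, descriptive codes, and themes shown are invented placeholders, and the actual analysis was performed qualitatively on the Miro board rather than with this (or any) script.

```python
# Minimal, hypothetical sketch of the clustering step: individual comments that
# have been given descriptive codes are rolled up into broader emergent themes.
# All comments, codes, and themes below are illustrative placeholders only.
from collections import defaultdict

coded_comments = [
    {"comment": "I wasn't sure what output the sponsor expected", "code": "unclear expectations"},
    {"comment": "Trust in the consultants grew as we worked together", "code": "growing trust"},
    {"comment": "The prototyping tool could not export to HTML", "code": "tooling limits"},
]

# Descriptive codes are grouped under higher-level themes during analysis.
code_to_theme = {
    "unclear expectations": "uncertainty",
    "growing trust": "uncertainty",
    "tooling limits": "platform dependency",
}

themes = defaultdict(list)
for item in coded_comments:
    themes[code_to_theme.get(item["code"], "uncategorised")].append(item["comment"])

for theme, comments in themes.items():
    print(f"{theme}: {len(comments)} comment(s)")
```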
Due to external circumstances beyond team members’ control, opportunities to enact learnings from this project were minimal, and no further development of GtMR occurred, although it continues to be used as a reference point within existing ERI work.
While this marked the formal end of the project, I continued to reflect on my experiences of the project. The experience of participatory evaluation provided a space for me to engage with my experiences and thoughts, which, as I note later, did involve some emotional and personal challenges. Over time, with further professional experience and academic research, these reflections gradually evolved into insights and lessons learnt, which I present here. In this way, there is an aspect of qualitative self-reflection that has been drawn into the evaluation. Qualitative self-reflections are used to allow an individual to see the connections between individual practices and experiences against broader contexts and understand the intersections of these to inform their future practice [41]. This includes considering how the characteristics of the practitioner and their internal states have contributed to their actions and any resulting artefacts [42,43]. The formal developmental and participatory evaluation mechanisms began these reflections, which continued over time.
It is therefore important to foreground that the lessons learnt and the insights shared in this paper are based on this milieu of personal reflection and learning, which has drawn inspiration from, but does not represent, the rest of the team or the broader dataset created through the formal developmental evaluation process. It is not the purpose of this paper to present qualitative and quantitative data from the developmental and participatory evaluation. Instead, these data and my perspective are leveraged together to improve the rigour and robustness of my practitioner’s perspective, with the goal of supporting other practitioners and contributing to emerging issues in our field.

4. Lessons Learnt and Future Advice

Reflecting on this learning process, I identify two main challenges that impacted development and delivery and might have been avoided. I believe signposting and forewarning these challenges for other professionals will improve their development and delivery experience and lower the transaction cost of pursuing UX options for science communication. Further, I believe that reflecting on these experiences will offer insights into the broader relationship between UX and the social sciences/digital humanities.

4.1. The Experience of Uncertainty

Surprisingly, I found progressing elements of this project difficult because of the affective nature of this work and the sense of personal and professional uncertainty the work evoked. This presented on multiple levels across the design and delivery process. Aspects of uncertainty included the nature of my relationship with our consultant colleagues and whether they could be trusted (although trust developed over time), uncertainty about the output expected by ERI project leaders and broader stakeholders, and a personal lack of familiarity with the human-centred design methodology the consultants employed. The management literature illustrates that uncertainty has a multiplicity of adverse impacts on projects and organisations [44]. Further to this, Critical Data Studies illustrate how emotions can be an underappreciated aspect of "data journeys" [45] and data practice [46], of which software development is an example [47]. Emotional states can contribute to reflexive practices of work that define data and data systems [46]. Put simply, emotions colour how we work, how the data product or service comes to be constituted, and, finally, how the product manifests in different contexts across its development lifecycle. My reflections made me wonder whether choices made in the design process occurred because they were the right choices and aligned with our goals or because my experience of uncertainty moved the team towards pathways that afforded more certainty and comfort. I am not suggesting that the quality of the product changed because of this, simply that emotional experiences contributed to how I engaged with the development and delivery process. This caused unnecessary professional delays, confusion, and personal discomfort.
If I were to participate in another UX design and delivery project, I would advocate for a more structured intellectual- and emotional-engagement process that features dedicated space and time to engage with these issues and address feelings of uncertainty. The personal uncertainty I felt in working with human-centred design methods has close parallels to some of the intellectual frictions present in how interdisciplinary and transdisciplinary sciences are developed [48,49,50], where scientists involved in creating innovative forms of science felt torn between a desire to participate and their existing personal and institutional perspectives [48]. In this research, significant thought was given to building the competencies necessary to develop transdisciplinary knowledge and, therefore, the intellectual security necessary to work productively in this space [51]. These competencies include individual, collective, cognitive, and contextual elements that transdisciplinary teams might consider in building their capabilities [51]. It is far beyond the scope of this paper to provide a detailed review or recommendations from the literature. However, we might learn from some of the clearest and simplest lessons. In a review of transdisciplinary learning vocabularies, Klein [52] notes that many of the key terms and ideas in the field concern collaboration, collective effort, and building consensus among team members as new knowledge and skills are created together. During the GtMR project, we never made a concerted effort to learn each other's skills or practices or build our collaborative capital outside of the project itself. We did not engage in any structured or unstructured learning to contextualise what we all brought as experts in our own right, or how our expertise contributed to how we worked and felt.
Given examples from transdisciplinary science, I believe that intellectual uncertainties might be addressed through formal and informal collaborative learning opportunities on these projects. Teams might allocate two hours a week—one for formal presentations and another for informal "play" (more on this momentarily)—to learn more about each other's skills, how they are working together, and what their work will look like. Formal presentations might present the history, background, and theory of our respective expertise, all topics that can be presented with a minimum of development time, given the expertise of the team. Informal sessions might draw upon "play" methods [53,54,55]—creative sessions that allow participants to experiment and learn skills in a low-stakes and enjoyable way, while still being relevant to the work. For example, UX work often utilises sketches or "low-fi prototypes" to test and storyboard ideas. Getting team members to collectively sketch stories to learn this method would be an easy way of building knowledge and collaborative trust. These (in)formal approaches fulfil Decuyper and colleagues' [56] criteria for effective team learning as well, in that they provide opportunities for the acquisition of skills and information (sharing information through presentation) and for participation and creation (through team members co-creating prototypes or other methods through play).
Considering the emotional and affective uncertainties I felt, I believe attaching dedicated emotional reflection or debriefing time to critical decision points or milestones would have allowed structured opportunities to reflect on any affective states (such as uncertainty) that we may have been feeling in our work. Through this reflection, we might have opportunities to understand how this is (or is not) influencing our work and decision-making. Contemporary project-management methodologies already feature meetings dedicated to assessing the progress of work, whether they be called milestones, gateways, or decision points. In agile methodologies (a project management approach often associated with computer programming and design; see Fernandez and Fernandez [57]), daily "stand-ups" are instituted to identify and address blockages in work on a daily basis. It would take very little effort to add an agenda item to raise any affective or emotional issues that might be influencing the work being undertaken. This might simply be an unstructured group conversation. Alternatively, there are practical checklists available that have been developed to allow for structured conversations on how decision-makers and leaders are engaging with their emotions during decisions [58,59]. Some groups of professionals, such as investors [60], have also benefited from using structured mental techniques, such as envisioning scenarios and mental distancing, to improve their decision-making. These opportunities for the (un)structured sharing of emotions—even through simple social talk—have the ability to contribute to intragroup relationships [61] (i.e., teams) and to the broader success (or failure) of a workplace as part of the "organisational climate" where emotions are shared [62].
Spending time reflecting on the affective and emotional elements of the work experience is a more abstract proposition for many teams, and there are also many other psychosocial variables that influence a workplace (see [62] for more). However, research has noted that interpersonal competencies play a role in how teams operate and work [63], including in software development [64]. It is also the reality of my experience that emotion was an important part of this project. With the gift of hindsight, we can see how these variables might influence our work and reflect on how we might better consider emotions as a part of UX practice. If we can develop a greater awareness of emotions, as individual team members and as a group, we are likely to feel more collaborative and positive about the work we are undertaking. Further, at a scientific level, this awareness could also be leveraged with insights from Critical Data Studies on how emotions can influence work with data (see [46]). This would produce a virtuous cycle of not just improving how teams work in development but also of raising awareness about the broader implications and qualities of the software they are working with—improving both delivery and long-term impact.

4.2. Platform Dependency

In the GtMR project, I believe that the design and delivery process faced unnecessary self-imposed technical challenges, as the design team fell victim to what I call “platform dependency”. Platform dependency is similar to path dependency, where previous decisions constrain future decisions (see [65]). The initial choice concerning which platform to build the demonstrator on became a constraint to the entire design and delivery experience. This experience of constraint was interesting because it was sociotechnical and concerned the intersection of situated work practices with broader assemblages of sociotechnical systems, institutions, and stakeholders.
Designing and delivering UX solutions requires specialised software. Early on in the journey, the team needed to determine which software solution to pursue. Consultants identified Microsoft PowerPoint and Adobe XD as options. PowerPoint was widely adopted but lacked many features and had rarely been used to provide UX solutions; it represented an ad hoc solution to the team's needs. Adobe XD and the Adobe Creative Suite cloud environment represented the industry standard. Adobe products presented a learning curve for all team members, but it was hoped that their features would provide greater flexibility. While the technical merits of each product were clear, our decision was also shaped by social elements such as the needs of our organisation, as other teams were simultaneously working on an ecosystem of digital products and were keen for potential contributions. This flagged a sociotechnical need to be strategically and technically interoperable with our organisation. In addition, considering the needs of potential stakeholders and customers and how best to engage their needs in our product was also flagged as important. To align with these different social and technical requirements, a decision was made to use the Adobe product.
This decision, however, proved to be problematic. As Rahman and Thelen [66] observe, the success of platforms in contemporary digital marketplaces is in part thanks to their "winner takes all" attitude and the creation of what some describe as "walled gardens" [67] that restrict access and only benefit paying customers. The walled garden metaphor is apt, as it illustrates some of the consequences I experienced. Once I began completing our work in the exclusive walled garden of Adobe's environment, I found myself isolated from other sources of technical support. The institutional and technical support that my home organisation usually supplied was limited due to incompatibility with the contracting arrangements the design team had been required to enter into. This meant I could not receive reliable technical support over the course of design, causing significant stress when technical errors occurred. Further, walled gardens often prevent external viewers from gathering the information they need to make an appropriate decision—a part of the information asymmetry that platforms have also incorporated into their business model [68]. As I worked through tasks in the Adobe environment, I discovered that basic features I had assumed would be included—such as the capacity to run "find and replace" text searches or export HTML—were not present. This increased my workload. Another challenge of the walled garden is the platform's approach to keeping users by making the exit cost high [69], as Adobe does through operating in a proprietary format. Choosing Adobe products—with high hopes for a flexible solution to the team's needs and those of our stakeholders—proved costly, as our decisions and operations along the UX journey were constrained by this choice.
The clear lesson for anyone entering into UX projects or any software solution is to avoid platform dependency. This might seem easier said than done, given the enormous scope of digital platforms such as those owned by Adobe. After all, what can one team or individual do against a multibillion-dollar corporation? I do not claim that individuals or teams have an easy path forward, but they are also not without options. Indeed, these options come from the unique knowledge that the digital humanities and social sciences provide to practitioners. Given my own experience and disciplinary knowledge, I make the following suggestions to address platform dependency.
First, challenge the information asymmetry that empowers platform business and disempowers practitioners. As noted earlier, while UX solutions are uncommon for social scientists, there is no shortage of other fields and practitioners in the private sector working in this space. The internet is also filled with freelancers, advice websites, discussion groups, and other sources of information on UX software and process, many of whom have spent time within the walled gardens of different software suites. Spending time in these communities and establishing prior knowledge about the realities of using software, platforms, and other proprietary aspects of UX would help counteract the information asymmetry that platforms rely on, putting practitioners in a better position to choose software and platforms that meet their own needs and those of their stakeholders. In her analysis of exploitative regimes of "surveillance capitalism", Zuboff [70] notes that part of the success and power of this form of political–economic structuration is in convincing society as a whole that such structures are inevitable logical conclusions. Zuboff contends that these are careful constructions funded through vested interests and encourages us to reject this mental model. This allows for new intellectual and practical spaces to be considered. Expanding our knowledge of UX, its tools, and nuances can help begin this process of countering information asymmetry by making us aware of the limits, opportunities, and alternatives within a domain we may not be familiar with. I had limited knowledge of UX when I began the GtMR project and accepted the status quo of UX tools rather than researching them. I will begin my next project with at least a day or two of dedicated research, conversations with members of the UX community (like my former consultant colleagues), and other investigations to understand how platform decisions might impact the next project and what practical opportunities I may (or may not) have.
Second, information collected through this research process can be used as the foundation for reflections and insights into the UX workflows being employed. The first recommendation is perhaps a bit nebulous; this second step clarifies it by taking this knowledge and applying it in order to consider how we might ameliorate issues of platform dependency. Specifically, we might consider opportunities and vulnerabilities in how our workflows are sociotechnically interoperable. By interoperable, I refer to the ability of different social and technical systems to connect functionally with one another in the performance of tasks. Interoperability is often referred to in a technical sense in discussions on platform architectures, datasets, and other technical systems [71,72,73,74]. However, I would expand this understanding to consider the different social systems and actors involved in a workflow or social interaction. The members of a design team could consider how they complete their work and what tools, skills, practices, and outputs they require to complete their project. They could also reflect on ways to ensure that they remain socially and technically interoperable to avoid platform dependency.
For example, in the GtMR project, one of the key issues we faced was Adobe XD’s inability to export to HTML. This prevented us from publishing the prototype easily ourselves, as the proprietary format XD saves in meant that we could not easily switch to another software solution to do this. We were locked into a walled garden at this point. Consider an alternative scenario where, in our project planning, we identified a critical need to deliver a prototype that was technically and socially interoperable—i.e., it had the ability to move between different team members easily and be worked on in different technical environments without limitations. With this in mind, and prior knowledge from step one, we might have eliminated creating a prototype in XD in our workflow because of known interoperability issues. Traditional UX practice calls for a “high fidelity” prototype (such as the one originally developed in Adobe XD for GtMR) in order to more fully test how a realised version of the digital product/service works with stakeholders [75]. Instead of this, we could have invested more heavily in “low fidelity” prototypes made of paper, text, and pictures and recruited a web developer who could have worked with us on directly translating these into HTML. This process would have maintained technical interoperability (as the developer would have directly written the prototype using common HTML rather than a proprietary format, and paper is highly interoperable between team members), while also allowing social interoperability (the team could communicate ideas in a more natural, generative way through conversation, graphics, and texts, not restricted by the platform).
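As a rough illustration of the alternative workflow sketched above, the following Python snippet shows how low-fidelity prototype content held in a plain, platform-neutral structure could be rendered directly to HTML. The screen titles and text are hypothetical, and this is not how GtMR was actually built; it simply demonstrates that keeping content out of a proprietary format preserves technical interoperability between team members and tools.

```python
# Hypothetical sketch only: low-fidelity walkthrough content kept in a plain,
# platform-neutral structure and rendered straight to HTML, so the prototype
# is not locked into a proprietary design-tool format. Screen text is invented.
from html import escape

screens = [
    {"title": "Why invest in resilience?",
     "body": "The high-level case for climate-adapted infrastructure."},
    {"title": "Planning the pathway",
     "body": "How a town might scope, sequence, and fund adaptation options."},
]

def render_walkthrough(screens):
    """Render the low-fi screens as a single, self-contained HTML page."""
    sections = "\n".join(
        f"<section><h2>{escape(s['title'])}</h2><p>{escape(s['body'])}</p></section>"
        for s in screens
    )
    return f"<!DOCTYPE html>\n<html><body>\n{sections}\n</body></html>"

with open("prototype.html", "w", encoding="utf-8") as f:
    f.write(render_walkthrough(screens))
```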
By understanding UX tasks better, critically reflecting on how we work, and adapting our work processes to ensure outputs are socially and technically interoperable, we can avoid having ourselves and our UX product caught in a walled garden. This is not to say that such reinventions are easy or without risk: it may be more costly to bring on a dedicated web developer, making iterations on the product would be harder in HTML than in dedicated tools such as Adobe XD, and low-fidelity prototyping might not capture all the features and details needed to bring a UX product to fruition. The point is, however, that it is possible to envision future practices and approaches that might avoid some aspects of platform dependency if we both reject the inevitability of platforms and are creative in how we work.

5. Concluding Reflections

My experience of designing and delivering a UX product for climate adaptation was a challenging but not unrewarding exercise. Reflecting on this experience, I now return to the core premise of the paper: how my personal experience might help us reflect more broadly on why there are not more collaborations between the digital humanities/social sciences, UX, and domain science areas. Thus, I offer the following concluding remarks.
First, reflecting on my assumptions concerning Rudnicki's [13] analysis and the belief that there are barriers and complications that prevent greater collaboration, I would submit that while there may indeed be barriers at play, my experience has suggested that some obvious ones are not an issue. For instance, the assumption that there are fundamental intellectual disagreements or incompatibilities between the social sciences and UX that make collaboration difficult was not borne out in my experience. Indeed, my ability to give practical advice comes largely from my social science knowledge (more on this below). The issue was more that we did not create ways of seeing, valuing, and understanding the different knowledge and practices we brought, which contributed to the state of emotional uncertainty described earlier. The issue was collaboration and communication more than knowledge and expertise. Indeed, I feel that those with a background in the digital humanities and social sciences are well equipped to intellectually and practically perform the work of UX. From the need to run requirement interviews to performing low-fi prototyping and creative writing work (all of which feature in the UX design lifecycle to some extent), social scientists and digital humanities practitioners have the skills and knowledge to accomplish these tasks.
Beyond the practical skills to perform these tasks, social scientists and digital humanities practitioners have the ability to perform them in a way that creates additional value and insight, something I believe is the special gift of this domain. As stated earlier, it is not the purpose of this paper or my intention to extrapolate the value of the social sciences or digital humanities to UX specifically or in any general manner. However, I would be remiss if I did not note that my unique insights were valued by the design team and that solutions to some of the problems that I describe here are rooted in how my social science training allows for an original and critical analysis of the world around me. For instance, my isolation of platform dependency as an issue and the potential counters to this are rooted in my interest in actor–network theory [76] and the belief that networks of human and non-human actors create social effects. Through engaging these networks, we can generate other social effects and power dynamics. The issues we faced with Adobe XD might have been written off as a simple technical limitation or the cost of doing business as usual. However, I believe my observations—based on my social science training—provided practical value to how we worked, what we created, and how I might do this work in future. Thus, if social scientists or digital humanities practitioners shy away from collaboration out of a belief that they, their knowledge, or their practice will not be valued, then I would disagree and encourage them to challenge this view.
My second and perhaps more problematic reflection concerns Dourish's [7] observations on the legitimacy trap and whether corporatised UX is removing opportunities for normative critique. It is beyond the scope of my experience to validate or reject this position—there was simply not scope in my work or experience to offer a strong conclusion here. However, I can reflect more generally on whether my experience did afford opportunities to conduct the meaningful, emancipatory, and pro-social work that the social sciences and digital humanities value. While addressing climate change is certainly pro-social, whether a UX product is the vehicle for this and whether I, as a social scientist, was able to meaningfully critique the assumptions and relationships at the nexus of climate change, UX, and social science are unclear to me. It is perhaps this lack of clarity itself that is a barrier to entry. In a purely academic environment, a social scientist or digital humanities professional has both control and visibility over the agendas and impacts they are seeking to achieve from their work. This is more ambiguous in UX. While a desire to achieve impactful results for stakeholders is a key driver in designing a UX product, whether this provides sufficient motivation for social scientists to participate is unclear to me. Further, given this was my first opportunity to perform UX work, I did not consider how/if I might challenge or problematise the nature of my work and its impacts, should I not find them in alignment with my personal or professional values. I find this question a much more convincing reflection of why more social science and digital humanities professionals are not engaging with UX. I also cannot ignore the empirical evidence that suggests a sense of powerlessness and ethical dissonance for UX practitioners generally [12]. Thus, I view normative disempowerment and misalignment as a potential barrier, which is in need of further investigation and reflection by all those working in this space.
I hope these reflections have informed social scientists and digital humanities professionals on the nature of UX work and the opportunities and challenges therein. Social scientists and digital humanities professionals have a lot to contribute to UX, especially given UX’s ability to connect and engage large audiences in the digital age. It would be a shame if these connections and engagements were the sole domain of corporate interest and not representative of a more critical and pro-social lens, which I believe the social sciences and digital humanities can bring to UX.

Funding

The ERI project and the development of the prototype received funding internally from CSIRO. The ERI project also received funding from commercial engagements with public, commercial, and not-for-profit sectors as part of its ongoing operation, which occurred in parallel to this work. The research contributing to this paper received no funding of any kind.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

No new data were created.

Acknowledgments

I would like to thank Seona Meharg for her feedback on an earlier version of this article and Richa Carneiro Alphonso for her editorial support. I would also like to thank my colleagues in the ERI team for supporting my participation in the project over its course.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Adger, W.N.; Barnett, J.; Brown, K.; Marshall, N.; O’Brien, K. Cultural Dimensions of Climate Change Impacts and Adaptation. Nature Clim. Chang. 2013, 3, 112–117. [Google Scholar] [CrossRef]
  2. Victor, D. Climate Change: Embed the Social Sciences in Climate Policy. Nature 2015, 520, 27–29. [Google Scholar] [CrossRef]
  3. Tauginienė, L.; Butkevičienė, E.; Vohland, K.; Heinisch, B.; Daskolia, M.; Suškevičs, M.; Portela, M.; Balázs, B.; Prūse, B. Citizen Science in the Social Sciences and Humanities: The Power of Interdisciplinarity. Palgrave Commun. 2020, 6, 89. [Google Scholar] [CrossRef]
  4. Hassenzahl, M.; Tractinsky, N. User Experience—A Research Agenda. Behav. Inf. Technol. 2006, 25, 91–97. [Google Scholar] [CrossRef]
  5. Shortt, M.; Tilak, S.; Kuznetcova, I.; Martens, B.; Akinkuolie, B. Gamification in Mobile-Assisted Language Learning: A Systematic Review of Duolingo Literature from Public Release of 2012 to Early 2020. Comput. Assist. Lang. Learn. 2021, 36, 517–554. [Google Scholar] [CrossRef]
  6. Lee, A.J.; Cook, P.S. The Myth of the “Data-driven” Society: Exploring the Interactions of Data Interfaces, Circulations, and Abstractions. Sociol. Compass 2020, 14, e12749. [Google Scholar] [CrossRef]
  7. Dourish, P. User Experience as Legitimacy Trap. Interactions 2019, 26, 47–49. [Google Scholar] [CrossRef]
  8. Gray, C.M.; Kou, Y.; Battles, B.; Hoggatt, J.; Toombs, A.L. The Dark (Patterns) Side of UX Design. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Montreal, QC, Canada, 21 April 2018; pp. 1–14. [Google Scholar]
  9. Gunawan, J.; Pradeep, A.; Choffnes, D.; Hartzog, W.; Wilson, C. A Comparative Study of Dark Patterns Across Web and Mobile Modalities. Proc. ACM Hum.-Comput. Interact. 2021, 5, 1–29. [Google Scholar] [CrossRef]
  10. Luguri, J.; Strahilevitz, L.J. Shining a Light on Dark Patterns. J. Leg. Anal. 2021, 13, 43–109. [Google Scholar] [CrossRef]
  11. Monge Roffarello, A.; De Russis, L. Towards Understanding the Dark Patterns That Steal Our Attention. In Proceedings of the CHI Conference on Human Factors in Computing Systems Extended Abstracts, New Orleans, LA, USA, 27 April 2022; pp. 1–7. [Google Scholar]
  12. Beattie, A.; Lacey, C.; Caudwell, C. “It’s like the Wild West”: User Experience (UX) Designers on Ethics and Privacy in Aotearoa New Zealand. Des. Cult. 2023, 1–20. [Google Scholar] [CrossRef]
  13. Rudnicki, S. Not a Mirror but a Tool: User Experience Research and the Production of Useful Social Knowledge. Curr. Sociol. 2023, 71, 337–355. [Google Scholar] [CrossRef]
  14. Lupton, D. Towards Design Sociology. Sociol. Compass 2018, 12, e12546. [Google Scholar] [CrossRef]
  15. Almquist, J.; Lupton, J. Affording Meaning: Design-Oriented Research from the Humanities and Social Sciences. Des. Issues 2010, 26, 3–14. [Google Scholar] [CrossRef]
  16. Buchanan, R. Surroundings and Environments in Fourth Order Design. Des. Issues 2019, 35, 4–22. [Google Scholar] [CrossRef]
  17. DiSalvo, C.; Sengers, P.; Brynjarsdóttir, H. Mapping the Landscape of Sustainable HCI. In Proceedings of the 28th International Conference on Human Factors in Computing Systems—CHI ’10, Atlanta, GA, USA, 10–15 April 2010; p. 1975. [Google Scholar]
  18. Silberman, M.S.; Nathan, L.; Knowles, B.; Bendor, R.; Clear, A.; Håkansson, M.; Dillahunt, T.; Mankoff, J. Next Steps for Sustainable HCI. Interactions 2014, 21, 66–69. [Google Scholar] [CrossRef]
  19. Ferreira, M.; Coelho, M.; Nisi, V.; Jardim Nunes, N. Climate Change Communication in HCI: A Visual Analysis of the Past Decade. In Proceedings of the Creativity and Cognition, Virtual Event, 22–23 June 2021; p. 1. [Google Scholar]
  20. Kuniavsky, M. User Experience and HCI. In The Human-Computer Interaction Handbook; Sears, A., Jacko, A., Jacko, J.A., Eds.; CRC Press: Boca Raton, FL, USA, 2007; pp. 897–916. ISBN 978-0-429-16397-5. [Google Scholar]
  21. Bardzell, J.; Bardzell, S.; DiSalvo, C.; Gaver, W.; Sengers, P. The Humanities and/in HCI. In Proceedings of the CHI ’12 Extended Abstracts on Human Factors in Computing Systems, Austin, TX, USA, 5 May 2012; pp. 1135–1138. [Google Scholar]
  22. Dourish, P.; Finlay, J.; Sengers, P.; Wright, P. Reflective HCI: Towards a Critical Technical Practice. In Proceedings of the CHI ’04 Extended Abstracts on Human Factors in Computing Systems, Vienna, Austria, 24 April 2004; pp. 1727–1728. [Google Scholar]
  23. Mills, C.W. The Sociological Imagination; Oxford University Press: Oxford, UK, 2000; ISBN 0-19-976112-4. [Google Scholar]
  24. Dietz, T.; Shwom, R.L.; Whitley, C.T. Climate Change and Society. Annu. Rev. Sociol. 2020, 46, 135–158. [Google Scholar] [CrossRef]
  25. Weaver, C.P.; Mooney, S.; Allen, D.; Beller-Simms, N.; Fish, T.; Grambsch, A.E.; Hohenstein, W.; Jacobs, K.; Kenney, M.A.; Lane, M.A.; et al. From Global Change Science to Action with Social Sciences. Nat. Clim. Chang. 2014, 4, 656–659. [Google Scholar] [CrossRef]
  26. Spiro, L. “This Is Why We Fight”: Defining the Values of the Digital Humanities. In Debates in the Digital Humanities; Gold, M.K., Ed.; University of Minnesota Press: Minneapolis, MN, USA, 2012; pp. 16–35. ISBN 978-0-8166-7794-8. [Google Scholar]
  27. Burawoy, M. The Critical Turn to Public Sociology. Crit. Sociol. 2005, 31, 313–326. [Google Scholar] [CrossRef]
  28. Burawoy, M. For Public Sociology. Am. Sociol. Rev. 2005, 70, 4–28. [Google Scholar] [CrossRef]
  29. Shi, L.; Moser, S. Transformative Climate Adaptation in the United States: Trends and Prospects. Science 2021, 372, eabc8054. [Google Scholar] [CrossRef]
  30. Wise, R.M.; Capon, T.; Lin, B.B.; Stafford-Smith, M. Pragmatic Cost–Benefit Analysis for Infrastructure Resilience. Nat. Clim. Chang. 2022, 12, 881–883. [Google Scholar] [CrossRef]
  31. CSIRO. Enabling Resilience Investment Guidance. Available online: https://research.csiro.au/enabling-resilience-investment/tools-products-services/guidance/ (accessed on 29 March 2023).
  32. Australian Broadcast Corporation Big Weather (and How to Survive It). Available online: https://iview.abc.net.au/show/big-weather-and-how-to-survive-it (accessed on 29 March 2023).
  33. Body, J.; Terrey, N. Design for a Better Future: A Guide to Designing in Complex Systems; Routledge: London, UK, 2019. [Google Scholar]
  34. Patton, M.Q. Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use; Guilford Press: Guilford, UK, 2010. [Google Scholar]
  35. Lazarow, N.; Meharg, S.; Butler, J.R.A.; Connor, J.; Duggan, K.; Roth, C. Evaluating Pathways to Impact for the DFAT-CSIRO Research for Development Alliance; CSIRO: Canberra, Australia, 2015. [Google Scholar]
  36. Odera, E.L. Capturing the Added Value of Participatory Evaluation. Am. J. Eval. 2021, 42, 201–220. [Google Scholar] [CrossRef]
  37. Chouinard, J.A. The Case for Participatory Evaluation in an Era of Accountability. Am. J. Eval. 2013, 34, 237–253. [Google Scholar] [CrossRef]
  38. Flood, M.; Ennis, M.; Ludlow, A.; Sweeney, F.F.; Holton, A.; Morgan, S.; Clarke, C.; Carroll, P.; Mellon, L.; Boland, F.; et al. Research Methods from Human-Centered Design: Potential Applications in Pharmacy and Health Services Research. Res. Soc. Adm. Pharm. 2021, 17, 2036–2043. [Google Scholar] [CrossRef]
  39. Green, J.; Willis, K.; Hughes, E.; Small, R.; Welch, N.; Gibbs, L.; Daly, J. Generating Best Evidence from Qualitative Research: The Role of Data Analysis. Aust. N. Z. J. Public Health 2007, 31, 545–550. [Google Scholar] [CrossRef]
  40. Braun, V.; Clarke, V. Using Thematic Analysis in Psychology. Qual. Res. Psychol. 2006, 3, 77–101. [Google Scholar] [CrossRef]
  41. Franks, T.M. Purpose, Practice, and (Discovery) Process: When Self-Reflection Is the Method. Qual. Inq. 2016, 22, 47–50. [Google Scholar] [CrossRef]
  42. Shaw, R. Embedding Reflexivity Within Experiential Qualitative Psychology. Qual. Res. Psychol. 2010, 7, 233–243. [Google Scholar] [CrossRef]
  43. Macbeth, D. On “Reflexivity” in Qualitative Research: Two Readings, and a Third. Qual. Inq. 2001, 7, 35–68. [Google Scholar] [CrossRef]
  44. Griffin, M.A.; Grote, G. When Is More Uncertainty Better? A Model of Uncertainty Regulation and Effectiveness. Acad. Manag. Rev. 2020, 45, 745–765. [Google Scholar] [CrossRef]
  45. Bates, J.; Lin, Y.-W.; Goodale, P. Data Journeys: Capturing the Socio-Material Constitution of Data Objects and Flows. Big Data Soc. 2016, 3, 1–12. [Google Scholar] [CrossRef]
  46. Choroszewicz, M. Emotional Labour in the Collaborative Data Practices of Repurposing Healthcare Data and Building Data Technologies. Big Data Soc. 2022, 9, 1–12. [Google Scholar] [CrossRef]
  47. Mackenzie, A. Cutting Code: Software and Sociality; Peter Lang: New York, NY, USA, 2006. [Google Scholar]
  48. Thompson, M.A.; Owen, S.; Lindsay, J.M.; Leonard, G.S.; Cronin, S.J. Scientist and Stakeholder Perspectives of Transdisciplinary Research: Early Attitudes, Expectations, and Tensions. Environ. Sci. Policy 2017, 74, 30–39. [Google Scholar] [CrossRef]
  49. Bulten, E.; Hessels, L.K.; Hordijk, M.; Segrave, A.J. Conflicting Roles of Researchers in Sustainability Transitions: Balancing Action and Reflection. Sustain. Sci. 2021, 16, 1269–1283. [Google Scholar] [CrossRef]
  50. Arnold, M.G. The Challenging Role of Researchers Coping with Tensions, Dilemmas and Paradoxes in Transdisciplinary Settings. Sustain. Dev. 2022, 30, 326–342. [Google Scholar] [CrossRef]
  51. O’Donovan, C.; Michalec, A.; Moon, J.R. Capabilities for Transdisciplinary Research. Res. Eval. 2022, 31, 145–158. [Google Scholar] [CrossRef]
  52. Klein, J.T. Learning in Transdisciplinary Collaborations: A Conceptual Vocabulary. In Transdisciplinary Theory, Practice and Education: The Art of Collaborative Research and Collective Learning; Fam, D., Neuhauser, L., Gibbs, P., Eds.; Springer International Publishing: Cham, Switzerland, 2018; pp. 11–23. ISBN 978-3-319-93743-4. [Google Scholar]
  53. Altarriba Bertran, F.; Márquez Segura, E.; Duval, J.; Isbister, K. Chasing Play Potentials: Towards an Increasingly Situated and Emergent Approach to Everyday Play Design. In Proceedings of the 2019 Designing Interactive Systems Conference, San Diego, CA, USA, 23–28 June 2019; pp. 1265–1277. [Google Scholar]
  54. Costello, B.; Edmonds, E. A Study in Play, Pleasure and Interaction Design. In Proceedings of the 2007 Conference on Designing Pleasurable Products and Interfaces, Helsinki, Finland, 22–25 August 2007; pp. 76–91. [Google Scholar]
  55. Svanaes, D.; Seland, G. Putting the Users Center Stage: Role Playing and Low-Fi Prototyping Enable End Users to Design Mobile Systems. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Vienna, Austria, 24–29 April 2004; pp. 479–486. [Google Scholar]
  56. Decuyper, S.; Dochy, F.; Van den Bossche, P. Grasping the Dynamic Complexity of Team Learning: An Integrative Model for Effective Team Learning in Organisations. Educ. Res. Rev. 2010, 5, 111–133. [Google Scholar] [CrossRef]
  57. Fernandez, D.J.; Fernandez, J.D. Agile Project Management—Agilism versus Traditional Approaches. J. Comput. Inf. Syst. 2008, 49, 10–17. [Google Scholar]
  58. Hess, J.D.; Bacigalupo, A.C. Enhancing Decisions and Decision-making Processes through the Application of Emotional Intelligence Skills. Manag. Decis. 2011, 49, 710–721. [Google Scholar] [CrossRef]
  59. Hess, J.D.; Bacigalupo, A.C. Applying Emotional Intelligence Skills to Leadership and Decision Making in Non-Profit Organizations. Adm. Sci. 2013, 3, 202–220. [Google Scholar] [CrossRef]
  60. Rose, S.; Wentzel, D.; Hopp, C.; Kaminski, J. Launching for Success: The Effects of Psychological Distance and Mental Simulation on Funding Decisions and Crowdfunding Performance. J. Bus. Ventur. 2021, 36, 106021. [Google Scholar] [CrossRef]
  61. Peters, K.; Kashima, Y. From Social Talk to Social Action: Shaping the Social Triad with Emotion Sharing. J. Personal. Soc. Psychol. 2007, 93, 780–797. [Google Scholar] [CrossRef] [PubMed]
  62. Ashkanasy, N.M.; Dorris, A.D. Emotions in the Workplace. Annu. Rev. Organ. Psychol. Organ. Behav. 2017, 4, 67–90. [Google Scholar] [CrossRef]
  63. Wiek, A.; Redman, A. What Do Key Competencies in Sustainability Offer and How to Use Them. In Competences in Education for Sustainable Development: Critical Perspectives; Springer International Publishing: New York, NY, USA, 2022; pp. 27–34. [Google Scholar]
  64. Guzman, E.; Bruegge, B. Towards Emotional Awareness in Software Development Teams. In Proceedings of the 2013 9th Joint Meeting on Foundations of Software Engineering—ESEC/FSE 2013, Saint Petersburg, Russia, 18–26 August 2013; p. 671. [Google Scholar]
  65. Greener, I. Theorising Path-Dependency: How Does History Come to Matter in Organisations? Manag. Decis. 2002, 40, 614. [Google Scholar] [CrossRef]
  66. Rahman, K.S.; Thelen, K. The Rise of the Platform Business Model and the Transformation of Twenty-First-Century Capitalism. Politics Soc. 2019, 47, 177–204. [Google Scholar] [CrossRef]
  67. Plantin, J.-C.; Lagoze, C.; Edwards, P.N.; Sandvig, C. Infrastructure Studies Meet Platform Studies in the Age of Google and Facebook. New Media Soc. 2018, 20, 293–310. [Google Scholar] [CrossRef]
  68. Guihot, M.; McNaught, H. Platform Power, Technology, and Law: Consumer Powerlessness in Informational Capitalism. Law Innov. Technol. 2021, 13, 510–549. [Google Scholar] [CrossRef]
  69. Bamberger, K.A.; Lobel, O. Platform Market Power. Berkeley Technol. Law J. 2017, 32, 1051–1092. [Google Scholar]
  70. Zuboff, S. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power; Hachette: New York, NY, USA, 2019. [Google Scholar]
  71. Bodle, R. Regimes of Sharing: Open APIs, Interoperability, and Facebook. Inf. Commun. Soc. 2011, 14, 320–337. [Google Scholar] [CrossRef]
  72. Chen, D.; Doumeingts, G.; Vernadat, F. Architectures for Enterprise Integration and Interoperability: Past, Present and Future. Comput. Ind. 2008, 59, 647–659. [Google Scholar] [CrossRef]
  73. Otjacques, B.; Hitzelberger, P.; Feltz, F. Interoperability of E-Government Information Systems: Issues of Identification and Data Sharing. J. Manag. Inf. Syst. 2007, 23, 29–51. [Google Scholar] [CrossRef]
  74. Zeng, M.L. Interoperability. KO Knowl. Organ. 2019, 46, 122–146. [Google Scholar] [CrossRef]
  75. Coleman, B.; Goodwin, D. Designing UX: Prototyping: Because Modern Design Is Never Static; SitePoint Pty Ltd.: Melbourne, Australia, 2017; ISBN 1-4920-1923-2. [Google Scholar]
  76. Latour, B. Reassembling the Social: An Introduction to Actor-Network-Theory; Oxford University Press: Oxford, UK, 2005. [Google Scholar]
Table 1. Reflection methodology for the “Getting to Mount Resilience” (GtMR) demonstrator.
The Following Project Stages Were Identified and Explored:
  1. Antecedent;
  2. Procurement and choice of platform;
  3. Development of prototype;
  4. Test and revise;
  5. Review, response to review, and editing;
  6. Finalisation and publication.
For Each Project Stage, the Following Questions Were Asked:
  1. What did you (the team member) need?
  2. Were your expectations met?
  3. What were the barriers and enablers to your work?
  4. What were you thinking and feeling?
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.