Article

Enhancing Mobile App Development for Sustainability: Designing and Evaluating the SBAM Design Cards

by Chiara Tancredi 1,*,†, Roberta Presta 1,†, Laura Mancuso 1 and Roberto Montanari 2

1 Department of Education, Psychology and Communication, University Suor Orsola Benincasa, 80135 Naples, Italy
2 RE:LAB Srl, 42122 Reggio Emilia, Italy
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
Sustainability 2025, 17(6), 2352; https://doi.org/10.3390/su17062352
Submission received: 23 January 2025 / Revised: 21 February 2025 / Accepted: 28 February 2025 / Published: 7 March 2025
(This article belongs to the Special Issue Environmental Behavior and Climate Change)

Abstract: Behavioral changes are critical for addressing sustainability challenges, which have become increasingly urgent due to the growing impact of global greenhouse gas emissions on ecosystems and human livelihoods. However, translating awareness into meaningful action requires practical tools to bridge this gap. Mobile applications, utilizing strategies from human–computer interaction (HCI) such as gamification, nudging, and persuasive technologies, have proven powerful in promoting sustainable behaviors. To support designers in developing effective apps of this kind, theory-based design guidelines were created, drawing on established theories and design approaches aimed at shaping and encouraging virtuous user behaviors that foster sustainability. To make these guidelines more accessible and enhance their usability during the design phase, this study presents their transformation into the SBAM card deck, a set of 11 design cards. The SBAM cards aim to simplify theoretical concepts, stimulate creativity, and provide structured support for design discussions, helping designers generate solutions tailored to specific project contexts. This study also evaluates the effectiveness of the SBAM cards in the design process through two workshops with design students. Results show that the cards enhance ideation, foster creativity, and improve designers’ perceived self-efficacy compared to using the same design guideline information presented in a traditional textual format. This paper discusses the SBAM cards’ design and evaluation methodology, findings, and implications, offering insights into how the SBAM design cards can bridge the gap between theory and practice in sustainability-focused mobile app development. To ensure broader accessibility, the SBAM cards have been made available to the public through a dedicated website.

1. Introduction

Addressing sustainability has become increasingly critical given the current global situation and anticipated future challenges: global greenhouse gas emissions are continuously rising, with a projected temperature increase of 3.2 °C by 2100—an unprecedented phenomenon in human history. This is primarily driven by unsustainable practices involving energy usage, land exploitation, and varying consumption and production patterns across regions and individuals [1]. Modern lifestyles—including our modes of transportation, dietary habits (with global food loss and waste accounting for 8% of global greenhouse gas emissions [2]), fashion consumption, and daily resource use—have exacerbated problems like glacier melting, erratic and destructive weather patterns, air pollution, rising sea levels, and climate-induced migration [3]. Individuals in higher socio-economic groups contribute disproportionately to these emissions but also have the most significant potential to mitigate them by altering their behaviors [1]. For instance, European households are responsible for 55% of food waste generation [2]. However, behavioral shifts require consistent support to transform awareness into concrete actions, bridging the gap between knowledge and actual behavior change [3,4,5].
Technology plays a key role in supporting individuals to adopt and sustain new sustainable behaviors, with mobile applications being one of the most effective tools due to their widespread usage and ability to foster personalized engagement (consumers spend an average of four hours and 15 min daily on mobile apps, with a yearly growth rate of 2.5%) [6]. These applications utilize human–computer interaction (HCI) strategies to encourage sustainable lifestyles, influence decision making, and guide behavior towards more environmentally friendly outcomes. Interactive technologies designed to change or influence people’s actions have a substantial theoretical foundation in behavioral science, with a particular focus on behavior change towards sustainability [7,8]. In fact, for sustainability apps to be considered lifestyle-changing, they must be grounded not only in theories of human behavior, such as Fogg’s Behavior Model (FBM) and the Theory of Planned Behavior (TPB), but also in an understanding of the role of habits in human life [7,9,10]. Moreover, promoting sustainable behaviors involves a specific type of behavioral shift—one that is often perceived as less convenient, more cognitively demanding, and strongly tied to individuals’ personal values [11,12]. Users of digital behavior change interventions (DBCI) and sustainability-focused apps (so-called green users) tend to be individuals who already have a predisposition towards sustainability but require an additional motivational boost. This can be achieved through intrinsic and extrinsic incentives, often leveraging design strategies such as digital nudges and gamification to reinforce behavior change [13,14,15,16,17].
For instance, Sozoniuk’s study [18] demonstrates the effectiveness of apps and ICT tools in developing recycling habits. Mobile apps and games that address sustainability are employed to reduce energy and water consumption, limit food waste, and encourage sustainable mobility practices such as cycling [19,20,21]. Mastorakis et al. [2] developed the FoodSaveShare app, which aims to combat household food waste by combining user data with retailer loyalty information, product databases, and online recipes. Similarly, GoEco [22] is a persuasive app designed to promote sustainable mobility through incentives. It helps users find alternative travel routes, set personal goals, and receive weekly progress reports, while also integrating gamification features like challenges and leaderboards to foster user motivation. Another notable example is an eco-incentive mobile application by Huang et al. [23], which rewards users with eco-credits for recycling end-of-life products, offering tangible incentives such as shopping discounts or donations for tree planting. These eco-credits link consumer behavior to their environmental footprint, reflecting the ecological impacts of their consumption choices.
Isensee et al. [24] investigated how mobile apps can foster a sustainability-oriented corporate culture, focusing on app design and implementation to drive sustainable behaviors through insights from gamification and nudging, supported by expert interviews and corporate app analyses.
Similarly, Tancredi et al. [17] examined psychological theories on habit formation and Digital Behavior Change Intervention (DBCI) approaches—such as digital nudging, persuasive technology, and gamification—to develop design guidelines specifically aimed at creating mobile apps for sustainability, known as the Sustainable Behavior Application for Mobile devices (SBAM) guidelines. The SBAM guidelines, derived from behavioral theories and validated through focus groups with green users, consist of 11 design principles intended to guide designers in developing apps that promote sustainable behaviors based on theoretical foundations. Indeed, designing such apps is challenging due to the multitude of theoretical approaches available to maximize their effectiveness. Moreover, relying solely on design guidelines presented in a tabular format can limit creativity and become overwhelming during the design process. To make theoretical insights more accessible during the design phase, one practical solution is the use of design cards [25,26,27].
Design cards typically transform theoretical frameworks into formats that make these concepts and theories more adaptable to the ideation phase [28]. Recognized as effective “knowledge transfer vehicles” [28], design cards provide step-by-step guidance that allows designers to quickly review and make decisions, thereby ensuring a smooth design flow [29,30]. Their aim is to facilitate a designer’s reflective conversation between existing knowledge and the specific design situation, helping to structure design discussions and ensuring that the design space is considered from multiple perspectives. Design cards can accelerate the refinement and iteration of ideas, initiate design discussions, and refocus the conversation when it becomes unproductive [28]. They have become popular tools in design due to their simplicity, tangibility, and manipulability, making the design process more visible and less abstract.
Design cards are typically hand-sized, featuring both text and images; the information provided on these cards gives designers a shared vocabulary for use in design discussions. They also serve as a physical reference during such discussions, thereby enhancing communication and fostering a common understanding [28,30]. As a low-tech common language, design cards facilitate rich communication [29].
Design cards have been classified through various studies to better understand their different purposes and applications within the design process. Wölfel and Merritt [30] identified several dimensions for categorizing card-based systems, including their purpose, placement in the design process, methodology of use, level of customization, and format. Subsequent studies further classified the purposes of design cards into categories such as systematic design methods, human-centered design, creative thinking, and team building [31,32]. Design cards can be effectively used across different stages of the design process, including research, ideation, prototyping, and implementation, making them versatile tools for fostering both individual creativity and collaborative work [32].
As such, building on the SBAM guidelines [17], an 11-card deck has been developed to assist designers—referred to as SBAM cards—with each card featuring one SBAM guideline on the front and a set of guiding questions on the back to direct the design process.
The rest of the article is organized as follows. The next section presents an explanation of how the SBAM guidelines were translated into design cards. Section 3 shows the materials and methods, as well as the process used to gather feedback, identify needs, and propose improvements to make SBAM cards a more effective tool for designing sustainable mobile apps. Section 4 presents the results from two workshops conducted with design students, which are then discussed in Section 5. Finally, concluding remarks and suggestions for future work are provided in Section 6.

2. The Development of the SBAM Cards

The SBAM guidelines [17] were developed through an in-depth analysis of key psychological theories on behavior and habit formation, as well as Digital Behavior Change Intervention (DBCI) approaches such as digital nudging, persuasive technologies, and gamification. This theoretical investigation was conducted with a specific focus on digital interventions, particularly mobile apps, that can support and encourage individuals in adopting sustainable behaviors. In this context, the authors examine the concepts of green users and green apps, emphasizing how digital solutions can influence users’ lifestyles. The analysis aimed to identify effective design directives for green applications that promote sustainable behaviors, leveraging well-established behavioral theories, as previously cited in the Introduction. In their study, the authors detail the process of deriving the SBAM guidelines from these theoretical foundations, with the goal of distilling key insights into concise, actionable principles that can effectively guide designers. This process involved synthesizing, simplifying, and categorizing the theories into thematic groups, with each group corresponding to a specific guideline. The final guidelines were structured into a tabular format, each including the guideline’s name and a brief, application-oriented description.
To provide designers with a more practical, collaborative, and flexible tool, we transformed the 11 SBAM guidelines into an 11-card deck, leveraging the potential of design cards to enhance the accessibility and usability of the framework developed by Tancredi et al. [17], making it more operational and effective in real-world design processes. Each card features one SBAM guideline on the front, with a set of guiding questions on the back to help guide the design process. The guiding questions are directly addressed to the designer, aiming to support them as much as possible in designing a coherent, meaningful, and successful app for shifting user behaviors toward more sustainable ones. The graphics remain very simple, with a shade of green that evokes the concept of sustainability and an icon that visually communicates the guideline concept without adding unnecessary complexity.
The SBAM cards stand out for their highly specific focus: the design of mobile applications aimed at encouraging sustainable behaviors. To our knowledge, they represent the first example of a card-based tool specifically developed for this purpose. If we were to compare them with existing design cards, the most relevant references would be the notable examples of “The nudge deck” [25] and “The Behavior Change Design (BCD) Cards” [33], as they share a thematic alignment with SBAM cards. However, a key distinction lies in their structural organization—both decks classify cards into different categories based on their function and intended use, whereas all SBAM cards serve the same purpose within the design process. Structurally, SBAM cards bear similarities to these existing decks, particularly in their use of a title-subtitle format with meaningful icons, and in their incorporation of guiding questions, resembling the “Stage cards” in BCD Cards [33]. However, SBAM cards are graphically more minimalist and intentionally avoid providing explicit examples, instead encouraging designers to reflect and ideate solely through guiding questions. This contrasts with both the previously mentioned decks, which frequently include graphical prototype examples to illustrate concepts.
Following the classifications presented before [30,31,32], the SBAM cards can be categorized as follows:
  • Context-specific or domain-specific design: SBAM cards are specifically designed for supporting the design of sustainable mobile apps;
  • Beginning of the process or ideation: they should be used from the initial phase of app design, as they provide support for ideating the concept, broad features, and core topics;
  • No system: SBAM cards can be used without specific instructions or examples of use;
  • No customization: they are intended as a supportive tool, requiring no customization;
  • Text and image: as shown in Figure 1, SBAM cards incorporate both icons and guiding questions.
Below, we present the guiding questions posed for one exemplary guideline, i.e., “Tell a story where the user is part of a whole”. The questions for this guideline are designed to help the designer create a narrative that emphasizes sustainability and social connection as follows:
  • How can your storytelling make the user feel interconnected and part of a collective sustainable change?
  • What sustainability objectives do you want to promote? How can you integrate them into a narrative structure that engages your user?
  • What narrative elements can you use (story, language, characters, colors, etc.)?
All of the SBAM cards and their respective guiding questions are illustrated in Figure 1, which shows both the front and back sides of each card in detail. The SBAM cards are publicly accessible through a dedicated website (https://unisob-snv-humades.notion.site/SBAM-cards-and-guidelines-1744fb4ba2e880a19079c734b7ad95b9?pvs=4, accessed on 20 February 2025. The SBAM cards are distributed under the Creative Commons Attribution 4.0 International license—CC BY 4.0).

3. Materials and Methods

In the following subsection, we outline the procedure used to validate and test the SBAM cards, including the entry and exit questionnaires, the hypotheses formulated, and the data analysis methods employed. This comprehensive approach allowed us to evaluate the effectiveness of the SBAM cards in supporting designers during the ideation of mobile apps encouraging sustainable behaviors and habit development, ensuring that the cards provide practical value and effectively guide the design process.
Our research questions can be framed as follows:
RQ1: 
What is the impact of SBAM cards on participants’ perceived design experience? We expect that the SBAM cards will enhance the following: (i) Participants’ self-efficacy beliefs in designing mobile apps fostering sustainable behaviors (defined as their confidence in their ability to successfully ideate and make decisions related to the design of mobile apps of that kind). By providing various techniques along with guiding questions, the cards should help designers formulate a comprehensive set of features for their mobile app projects. (ii) Participants’ experienced creativity. By supporting ideation, reducing the risk of design fixation, and fostering discussions among team members, we expect participants to feel more creative during the design process. (iii) Lastly, we expect participants to perceive the SBAM cards as a useful and easy-to-use tool, as they offer actionable insights in a practical format.
RQ2: 
How do the SBAM cards affect the quality of ideas? We expect the SBAM cards to support designers in creating mobile apps fostering sustainable behaviors and habits that can be described as follows: (i) theoretically grounded, as they align with the SBAM guidelines framework, which integrates behavioral theories, gamification, digital nudging, and other strategies for supporting sustainable behavior change by mobile apps; (ii) creative, as the use of design cards encourages divergent thinking, fostering innovative and diverse design solutions.

3.1. Procedure

To evaluate the effectiveness of the SBAM cards in supporting design teams in creating mobile apps for sustainable behaviors, we conducted two workshops in December 2024. The first workshop involved 19 participants, while the second included 23 participants, of whom 18 completed the exit questionnaire and were therefore included in the analysis. Both groups were university Design students, as follows: the first group consisted of Master’s students, and the second group comprised Bachelor’s students from the Faculty of Design and Architecture of University of Naples Federico II. We used a between-subjects experimental design with the following two conditions: experimental (using the SBAM cards) and control (without the SBAM cards but with textual documentation material). Participants were organized into teams of three–four members each and divided into the following two groups: experimental (first workshop: 3 teams; second workshop: 3 teams—marked as “E”) and control (first workshop: 3 teams; second workshop: 4 teams—marked as “C”).
The participants were informed that the goal of the workshop was to create apps that encourage sustainable behaviors in the following four different domains provided to them: mobility, food waste, energy consumption, and water use. The experimental group was then given a set of SBAM cards per team, with the explanation that these cards are a supportive tool to be consulted and used in the app creation process, allowing them to independently decide how many of the card guidelines to implement (Figure 2). The control group, on the other hand, was provided with theoretical documentation, referred to as “documentation material”, on which the cards were based. This consisted of 5 pages extracted from the SBAM guidelines paper [17].
Each team was given 90 min to ideate and prototype an app, including providing a rationale for design decisions and creating graphical representations of some of the app screens. Before starting, participants were asked to sign a privacy notice and an informed consent form, and they were assigned a unique ID to anonymize their responses to both the entry and the exit questionnaires, which will be explained in detail in the following subsections. After completing the entry questionnaire, participants were given a brief 20-minute presentation on the theories related to sustainable behaviors and habits, digital interventions for behavior and habit change, and gamification. They were then introduced to the four challenges they could choose from—mobility, food waste, energy consumption, and water use. Below, we provide an example of how one of these challenges was presented:
  • Challenge: Sustainable Mobility
  • Problem statement: Cities are facing a crisis of traffic congestion, air pollution, and reliance on private vehicles. According to a report by the International Energy Agency, the transport sector accounts for 24% of global greenhouse gas emissions. In urban areas, air pollution directly contributes to respiratory and cardiovascular diseases. Despite efforts to promote cycling infrastructure and public transport, many people still find it difficult or inconvenient to abandon private car use.
  • Guiding question: How can we help people choose and maintain more sustainable and healthy modes of transportation?
Finally, after completing the exit questionnaire, participants received a certificate of participation. The app projects designed during the workshop sessions were created and presented using a standardized template (as shown in Appendix A), and were then submitted in anonymized form for the design quality evaluation.

3.2. Entry Questionnaire

The entry questionnaire aims not only to gather demographic information about the workshop participants, such as age, gender identity, and education level, but also to collect details about their design background, including years of experience in UI/UX and app design. Additionally, participants were asked about their familiarity and knowledge of sustainability apps. Finally, based on the self-efficacy scale used by Caraban et al. [25,34], participants were asked to self-assess their level of autonomy and effectiveness in performing various tasks using a Likert scale from 0 to 10, as follows:
  • Designing mobile apps that promote sustainable behaviors;
  • Applying design techniques to encourage the adoption of sustainable behaviors through mobile apps;
  • Identifying the best design approaches for mobile apps aimed at promoting the adoption of sustainable behaviors;
  • Justifying the design choices made in the development of mobile apps that encourage sustainable behaviors in users.
The internal reliability of the instrument was high (Cronbach’s alpha = 0.90).
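For illustration, the sketch below (not the authors’ analysis script) shows how the four-item scale could be scored: Cronbach’s alpha computed over a participants × items matrix and, as an assumption on our part, a conversion of the 0–10 ratings to the centesimal (0–100) values reported in Section 4 by rescaling the item mean. The example responses are hypothetical.

```python
# Illustrative sketch only: alpha over the four self-efficacy items and an assumed
# 0-100 (centesimal) rescaling of the 0-10 ratings. Data below are hypothetical.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: participants x items matrix of 0-10 ratings."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)       # variance of each item across participants
    total_variance = items.sum(axis=1).var(ddof=1)   # variance of the summed scale score
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

def self_efficacy_centesimal(items: np.ndarray) -> np.ndarray:
    """Assumed conversion: mean of the four 0-10 items rescaled to 0-100 per participant."""
    return items.mean(axis=1) * 10

# Hypothetical responses from four participants to the four items listed above
responses = np.array([
    [7, 8, 6, 7],
    [5, 5, 6, 4],
    [9, 8, 9, 9],
    [6, 7, 7, 6],
], dtype=float)
print(round(cronbach_alpha(responses), 2))
print(self_efficacy_centesimal(responses))
```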

3.3. Exit Questionnaire

The exit questionnaire is divided into several sections to evaluate the experiences of the participants (RQ1) as follows:
  • Self-efficacy scales [25,34]: the same four items of the entry questionnaire are repeated here to allow comparison;
  • Experienced creativity, measured with the Creativity Support Index (CSI) [35]: this section uses twelve 10-point Likert scales to assess six dimensions (collaboration, enjoyment, exploration, expressiveness, immersion, and results worth effort) and includes 15 pairwise comparisons between the six factors (a computation sketch follows this list);
  • Perceived usability of the tool: we used the SUS (System Usability Scale) by Brooke [36], evaluated with 10 items on a 5-point scale;
  • Perceived usefulness of the tool: by adapting the usefulness dimension of the Technology Acceptance Model (TAM) [37], participants evaluated the cards and documentation material using a 7-point scale;
  • Open-ended questions: two open-ended questions were included to gather participants’ insights on the pros and cons of the tool used.
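As a point of reference for the CSI item above, the following sketch reproduces the computation as we read the original formulation [35]: each factor’s two agreement ratings are summed and weighted by the number of times that factor was chosen in the 15 pairwise comparisons, and the weighted sum is divided by 3 to obtain a 0–100 index. The ratings and comparison counts below are hypothetical, not study data.

```python
# Hedged sketch of the CSI formula as we understand it from Cherry & Latulipe [35].
FACTORS = ["Collaboration", "Enjoyment", "Exploration",
           "Expressiveness", "Immersion", "ResultsWorthEffort"]

def csi(ratings: dict[str, tuple[int, int]], counts: dict[str, int]) -> float:
    """ratings: the two 1-10 agreement scores per factor; counts: wins out of 15 comparisons."""
    assert sum(counts.values()) == 15, "all 15 pairwise comparisons must be assigned"
    weighted = sum(sum(ratings[f]) * counts[f] for f in FACTORS)  # factor score x factor count
    return weighted / 3.0                                         # scales the maximum to 100

# Hypothetical responses from one participant
ratings = {"Collaboration": (8, 7), "Enjoyment": (9, 8), "Exploration": (7, 8),
           "Expressiveness": (6, 7), "Immersion": (5, 6), "ResultsWorthEffort": (8, 9)}
counts = {"Collaboration": 3, "Enjoyment": 4, "Exploration": 3,
          "Expressiveness": 2, "Immersion": 1, "ResultsWorthEffort": 2}
print(round(csi(ratings, counts), 2))
```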

3.4. Design Quality Evaluation

To answer our second research question, the evaluation process was conducted by a panel of six evaluators, including the first and second authors—who have expertise in behavioral change, gamification, digital nudging, and mobile app design—as well as four experienced mobile app designers specializing in various domains, including sustainability. The projects were anonymized by the third author, ensuring that the evaluators were blind to the conditions (i.e., whether the designs had been created with the support of the SBAM cards). Each evaluator conducted the assessment independently, without collaboration.
To establish a common understanding of the evaluation criteria, the evaluators jointly assessed a small subset of the existing design solutions—specifically, three sustainability-focused mobile apps available on the App Store. This preliminary assessment allowed them to refine the rating strategy before proceeding with the evaluation of the workshop outcomes. The evaluation focused on two key dimensions, assessed using a forced-choice 4-point scale, addressing the following questions:
  • Theoretical grounding: for each SBAM guideline, evaluators assessed the extent to which it was effectively implemented. Ratings ranged from 1 (poor implementation) to 4 (good implementation). If a guideline was not applied or its implementation could not be determined, it was assigned a score of 0, and that guideline’s rating was excluded from the final calculation. The theoretical grounding score was then calculated as the average rating across all implemented guidelines, i.e., only those with a score higher than 0 (a scoring sketch follows this list).
  • Creativity: the evaluators assessed the originality of each app by answering the following question: “How innovative and original is the app compared to the entire pool?” Creativity was defined as the originality and novelty of the concept in relation to other projects within the evaluation set. This dimension was rated on a scale from 1 (not original at all) to 4 (highly original). To ensure consistency across evaluations, after the initial round of assessments—based also on their expert judgment—evaluators conducted a second review of their ratings in light of the full set of projects, refining their evaluations to maintain coherence within the pool [38].
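The scoring rule for theoretical grounding can be summarized with the minimal sketch below; the per-guideline ratings shown are hypothetical and serve only to illustrate how 0-rated guidelines are excluded from the average.

```python
# Minimal sketch of the theoretical-grounding score described above: per-guideline
# ratings of 0-4 (0 = not applied / not determinable), averaged over the guidelines
# actually implemented (rating > 0). Variable names are illustrative.
def theoretical_grounding(guideline_ratings: list[int]) -> float:
    """Average rating over implemented guidelines; 0-rated guidelines are excluded."""
    implemented = [r for r in guideline_ratings if r > 0]
    if not implemented:
        return 0.0
    return sum(implemented) / len(implemented)

# One evaluator's ratings for the 11 SBAM guidelines of a hypothetical project
ratings = [3, 4, 0, 2, 3, 0, 4, 3, 0, 2, 3]
print(theoretical_grounding(ratings))  # 3.0
```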

3.5. Hypotheses and Data Analysis

To address our research questions, we formulated the following hypotheses:
H1: 
(Entry-Exit) self-efficacy of [E] group > (Entry-Exit) self-efficacy of [C] group. This would indicate that the support provided by the SBAM cards made participants feel more self-efficacious in ideating sustainable mobile apps compared to before the workshop.
H2a: 
CSI of [E] group > 50. This would suggest that the SBAM cards effectively supported creativity and team ideation during the design task.
H2b: 
CSI of [E] group > CSI of [C] group. This would suggest that the SBAM cards supported creativity and team ideation more effectively than theoretical concepts organized in written documentation.
H3a: 
SUS of [E] group > 68. This would demonstrate that the SBAM cards are perceived as easy to use, indicating a positive perception.
H3b: 
SUS of [E] group > SUS of [C] group. This would demonstrate that the SBAM cards are perceived as more usable than the documentation material.
H3c: 
Usefulness of [E] group > Usefulness of [C] group. This would demonstrate that the SBAM cards are perceived as more useful than the documentation material.
H4: 
Theoretical grounding of [E] group > Theoretical grounding of [C] group. This would imply that the SBAM cards provide adequate support to designers in applying theoretical foundations, thereby enabling the creation of more effective mobile apps for promoting sustainable behaviors.
H5: 
Creativity of [E] group > Creativity of [C] group. This would verify that the SBAM cards, by supporting the design process, not only enhance the application of solid theoretical foundations for sustainable app design but also foster creativity, providing designers with more room for exploration and innovation.

4. Results

Below, we present the results of the two workshops, initially keeping them separate and then providing comparative figures (with the first workshop represented in yellow using a solid line and the second in orange using a dashed line) before discussing them together.

4.1. First Workshop: 19 Participants

A total of 19 graduate students enrolled in the Master’s program in Design for the Built Environment participated in the first workshop, conducted on 10 December 2024 (14 female, 5 male; median age = 27). Participants had undergraduate degrees in Design, Design for Community and Communication, Architecture, and Fine Arts. Reported design experience varied as follows: 7 participants had 1–3 years of experience, 7 had more than 3 years, and 5 had less than 1 year. Regarding experience in mobile app design, 18 participants reported having less than 1 year of experience, while 1 participant reported 1–3 years. On average, participants had prototyped approximately 3 apps (M = 2.68, SD = 1.91). Participants reported moderate prior exposure to sustainability apps (on a 7-point scale, M = 3, SD = 1.79) and indicated having used, on average, nearly two sustainability apps (M = 1.73, SD = 2.32).
Participants’ self-efficacy scores were relatively high. Following the guidelines for self-efficacy scales [25,34], scores were reported in centesimal values. The overall mean was M = 69.21 (SD = 10.28). When analyzed by groups, the experimental group scored M = 68.33 (SD = 12.99), while the control group scored M = 70 (SD = 7.73).
The comparison between the experimental group (M = 5.6, SD = 1.14) and the control group (M = 4.7, SD = 1.42) regarding the use of their respective tools did not reveal a statistically significant difference (t = 1.33, p = 0.215), although both groups reported medium-high usage.
Starting with the first hypothesis, H1: (Entry-Exit) self-efficacy of [E] group > (Entry-Exit) self-efficacy of [C] group, we observed a significant increase in participants’ self-reported efficacy beliefs for the experimental condition (M [Entry E] = 68.33, SD = 12.99; M [Exit E] = 78.89, SD = 12.63; t(8) = −2.72, p = 0.026). The effect size was large (Cohen’s d = 1.09, 95% CI [0.23, 1.94]), indicating a strong increase in self-efficacy. In contrast, no significant increase was observed in the control condition (M [Entry C] = 70.0, SD [Entry C] = 7.73; M [Exit C] = 67.5, SD [Exit C] = 9.93; t(9) = 0.55, p = 0.597), despite an effect size also in the large range (Cohen’s d = 1.29, 95% CI [0.41, 2.17]). To further evaluate the hypothesis, we computed a difference score (delta = Exit - Entry) for each participant and conducted a Welch’s t-test for independent samples. The analysis revealed a statistically significant mean difference score between the experimental group and the control group (delta [E] = 10.56, SD = 11.64; delta [C] = −2.50, SD = 14.43; t(16) = 2.18, p = 0.044), although the corresponding effect size was small and non-significant (Cohen’s d = −0.15, 95% CI [−0.96, 0.65]). These findings suggest that the experimental group exhibited a statistically significant improvement in self-efficacy compared to the control group; however, since the effect size of the group comparison was small and its confidence interval included zero, responses varied considerably, providing only partial support for hypothesis H1 (Figure 3, represented by the yellow solid line).
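For transparency about the delta-score procedure, the following sketch (illustrative only, with placeholder arrays rather than the study data) computes delta = Exit − Entry per participant, runs Welch’s t-test between the two groups, and reports a pooled-SD Cohen’s d, which is one common convention for this effect size.

```python
# Sketch of the delta-score analysis described above (not the authors' code).
import numpy as np
from scipy import stats

def cohens_d(a: np.ndarray, b: np.ndarray) -> float:
    """Cohen's d between two independent samples using the pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled_sd = np.sqrt(((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2))
    return (a.mean() - b.mean()) / pooled_sd

# Hypothetical entry/exit self-efficacy scores (0-100) per participant
entry_e = np.array([60, 75, 80, 55, 70, 65, 72, 68, 70], dtype=float)
exit_e  = np.array([75, 85, 88, 70, 80, 78, 82, 76, 80], dtype=float)
entry_c = np.array([68, 72, 75, 65, 70, 74, 69, 71, 66, 70], dtype=float)
exit_c  = np.array([66, 70, 74, 60, 72, 71, 65, 70, 64, 68], dtype=float)

delta_e = exit_e - entry_e   # per-participant change in the experimental group
delta_c = exit_c - entry_c   # per-participant change in the control group

t, p = stats.ttest_ind(delta_e, delta_c, equal_var=False)  # Welch's t-test
print(f"Welch t = {t:.2f}, p = {p:.3f}, d = {cohens_d(delta_e, delta_c):.2f}")
```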
To verify H2a: CSI of [E] group > 50, the CSI was calculated according to its formula [35], yielding a CSI value for the cards of 69.04, which corresponds to a grade of almost “C”. This result confirms hypothesis H2a.
To verify H2b: CSI of [E] group > CSI of [C] group, an independent-samples t-test was conducted. The results showed a significant difference between the experimental group and the control group (M [E] = 69.04, SD = 24.80; M [C] = 46.87, SD = 14.80, t(12) = 2.33, p = 0.037), with a large effect size (Cohen’s d = 1.46, 95% CI [0.66, 2.26]), confirming hypothesis H2b (Figure 4, represented by the yellow solid line).
To verify H3a: SUS of [E] group > 68 [36], the SUS was calculated to assess the perceived usability of the system. The experimental group achieved a mean SUS score of 77.5, indicating a good level of usability and supporting hypothesis H3a.
To further investigate H3b: SUS of [E] group > SUS of [C] group, the SUS scores of both groups were compared. The experimental group, which interacted with the deck of cards, achieved a significantly higher score (77.5, SD = 20.92) compared to the control group, which received theoretical pages (51.25, SD = 6.69). An independent-samples t-test confirmed this difference as statistically significant (t(9) = 3.60, p = 0.005), with a large effect size (Cohen’s d = 1.40, 95% CI [0.42, 2.38]), confirming H3b (Figure 5, represented by the yellow solid line).
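The SUS values above follow Brooke’s standard scoring rule [36]; as a quick reference, the sketch below applies that rule to a hypothetical set of responses (it is a generic scoring sketch, not the study data): odd-numbered items contribute (rating − 1), even-numbered items contribute (5 − rating), and the sum is multiplied by 2.5 to reach the 0–100 range.

```python
# Standard SUS scoring (Brooke [36]) as a quick sketch; example responses are hypothetical.
def sus_score(responses: list[int]) -> float:
    """responses: ten ratings on a 1-5 scale, in questionnaire order."""
    assert len(responses) == 10
    contributions = [(r - 1) if i % 2 == 0 else (5 - r)   # indices 0,2,4,... are items 1,3,5,...
                     for i, r in enumerate(responses)]
    return sum(contributions) * 2.5

print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # 85.0
```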
To evaluate H3c: Usefulness of [E] group > Usefulness of [C] group, the perceived usefulness scores were calculated as the mean of responses to 10 items from the TAM questionnaire. The experimental group achieved a higher perceived usefulness score (M = 5.60, SD = 1.78) compared to the control group (M = 3.93, SD = 0.85). A Mann–Whitney U test revealed that this difference was statistically significant (U = 73.0, p = 0.012), with a large effect size (Cohen’s d = 1.29, 95% CI [0.51, 2.08]), supporting H3c (Figure 6, represented by the yellow solid line).
Finally, we analyzed the open-ended comments. Regarding the documentation material, all participants found it difficult to consult and not very supportive in a collaborative app design context: “It wasn’t very useful; it was too long, lacked graphics relevant to a digital design course, and felt boring” (C2B).
In contrast, participants who used the cards expressed several positive and constructive opinions. Specifically, it was highlighted that the cards help to “explore ideas” (five participants), facilitate “communication” (three participants), and are “simple to use” (three participants). They were also considered “easy to interpret” (three participants) and useful to “outline a working method” (one participant), as well as improving the quality of apps by providing concrete suggestions to make them more functional (two participants). For instance, participant A1A stated that “The cards helped us stay focused on the main purpose of the app while also providing ideas and ways to make it more interactive, user-friendly, and interesting”.
Some participants recognized that the cards offer “great support for those who are new to this type of work” (two participants) and reflected on the fact that “They cannot all be used in a single app; it is necessary to select only those that are most relevant” (participant A1B). Additionally, some suggestions for improvement emerged, such as the request for a “thematic classification” (one participant) and a “numbered order of visualization” (one participant).

4.2. Second Workshop: 18 Participants

A total of 23 students with a high school diploma enrolled in the Bachelor’s program in Community Design participated in the second workshop, conducted on 17 December 2024. Of these, only 18 completed both questionnaires (Female = 12, Male = 5, Prefer not to answer = 1) and were therefore considered as the final sample, indicating an initial screening based on interest. All 18 participants had completed high school in various fields, such as classical (3), scientific (4), linguistic (3), or artistic high schools (5), and technical institutes specializing in graphics and communication (3). Reported design experience varied as follows: 15 participants had less than 1 year of experience, and 3 had between 1 and 3 years. Regarding experience in mobile app design, 16 participants reported having less than 1 year of experience, while 2 participants reported 1–3 years. On average, participants had prototyped approximately one app (M = 0.83, SD = 2.06). Participants reported limited prior exposure to sustainability apps (on a 7-point scale: M = 2.83, SD = 1.46) and indicated having used, on average, nearly one sustainability app (M = 0.77, SD = 0.94).
Participants’ self-efficacy scores were moderate: the overall mean was M = 50.14 (SD = 20.53). When analyzed by groups, the experimental group scored M = 55 (SD = 23.49), while the control group scored M = 45.28 (SD = 17.07).
The comparison between the experimental group (M = 5.11, SD = 2.03) and the control group (M = 3.78, SD = 0.67) regarding the use of their respective tools did not reveal a statistically significant difference (t(12) = 1.87, p = 0.091), indicating a medium level of usage.
For the first hypothesis, H1: (Entry-Exit) self-efficacy of [E] group > (Entry-Exit) self-efficacy of [C] group, the data provide strong support. A paired t-test demonstrated a significant increase in self-efficacy for the experimental group (M [Entry E] = 55, SD [Entry E] = 20.53; M [Exit E] = 79.72, SD [Exit E] = 13.14; t(8) = −3.84, p = 0.005), with a very large effect size (Cohen’s d = 1.99, 95% CI [1.03, 2.95]). In contrast, the control group did not exhibit a statistically significant change (M [Entry C] = 45.28, SD [Entry C] = 17.07; M [Exit C] = 52.78, SD [Exit C] = 17.34; t(8) = −1.89, p = 0.096). However, the effect size was moderate (Cohen’s d = 0.69, 95% CI [−0.15, 1.54]). To further explore changes, difference scores (Exit-Entry) were calculated for each participant. A Welch’s t-test revealed that the experimental group (M = 24.72, SD = 19.30) improved significantly more than the control group (M = 7.50, SD = 11.92, t(13) = 2.28, p = 0.040), with a large effect size (Cohen’s d = 0.99, 95% CI [0.11, 1.87]). These findings highlight that participants in the experimental condition experienced a greater enhancement in self-efficacy, thereby confirming hypothesis H1 (Figure 3, represented by the orange dashed line).
To verify H2a: CSI of [E] group > 50, the CSI for the experimental group was calculated as 68.96, corresponding to a grade of approximately “C”, thereby supporting H2a.
To verify H2b: CSI of [E] group > CSI of [C] group, an independent-samples t-test was conducted. The results showed a significant difference between the experimental group and the control group (M [E] = 68.96, SD = 28.02; M [C] = 41.85, SD = 11.69; t(10) = 2.68, p = 0.022), with a large effect size (Cohen’s d = 1.31, 95% CI [0.42, 2.21]). These findings suggest that the experimental group’s creativity support was significantly greater than that of the control group, supporting hypothesis H2b (Figure 4, represented by the orange dashed line).
To verify H3a: SUS of [E] group > 68, the SUS was calculated, yielding a score of 80.56, indicating a good level of usability and supporting hypothesis H3a.
To further investigate usability, the SUS scores of both groups were compared (H3b: SUS of [E] group > SUS of [C] group). The difference between the SUS of the experimental group (80.56, SD = 20.83) and the SUS of the control group (53.44, SD = 14.57) was found to be significant using a Mann–Whitney U test (U = 68.5, p = 0.007), with a large effect size (Cohen’s d = 1.47, 95% CI [0.59, 2.34]), confirming H3b (Figure 5, represented by the orange dashed line).
Evaluating H3c: Usefulness of [E] group > Usefulness of [C] group, the experimental group achieved a higher perceived usefulness score (M = 5.46, SD = 1.58) compared to the control group (M = 4.38, SD = 1.00). A Mann–Whitney U test confirmed that this difference was statistically significant (U = 64.0, p = 0.021), with a moderate effect size (Cohen’s d = 0.85, 95% CI [0.02, 1.69]), further supporting H3c (Figure 6, represented by the orange dashed line).
Finally, we analyzed the open-ended comments. Regarding the control group, participants found the material difficult to use in creative and collaborative contexts, describing it as “complex” (seven participants) and “difficult to manage in creative work” (two participants). Some also highlighted that while the material was “informative” (one participant) and “comprehensive” (one participant), its overall complexity hindered its effectiveness in fostering dialogue and collaboration, as stated by participant C6C, as follows: “A material of this type is very difficult to consult in creative and collaborative contexts; it tends to hinder dialogue rather than encourage it”.
In contrast, participants in the experimental group provided several positive and constructive comments about the SBAM cards. They emphasized that the cards “facilitate idea formulation” (four participants) and “teamwork” (five participants), while also being considered “comprehensive” (five participants) and “summarizing” (three participants). For instance, participant E5A stated the following: “They are an excellent guide for designers to understand the types of users they are interacting with and how to approach them effectively on such a complex topic”.
Some participants also acknowledged areas for improvement, such as the need for “a basic understanding of the theories” before using it (one participant) and the suggestion for “more direct guiding questions” (two participants) on the back of the cards. These comments suggest that, overall, the SBAM cards were well received but could benefit from additional refinements to make them even more accessible and user-friendly.

4.3. Design Quality Evaluation

The anonymized projects (a total of 13) were independently evaluated by six expert reviewers based on multiple dimensions. The 13 projects were distributed across four thematic areas as follows: food waste (5), mobility (4), energy consumption (2), and water waste (2).
To evaluate H4: Theoretical grounding of [E] group > Theoretical grounding of [C] group, the mean scores from the six expert ratings for each project were analyzed. The experimental group (E) achieved higher overall scores (M = 2.85, SD = 0.34) compared to the control group (C) (M = 1.84, SD = 0.17). A Mann–Whitney U test revealed a significant difference between the two groups (U = 38, p = 0.014), indicating that projects developed by the experimental group received substantially higher evaluations than those from the control group (Figure 7). Additionally, the effect size was very large (r = 0.67), calculated using Rosenthal’s formula for the Mann–Whitney U test due to the small scale of the rating system. This suggests that the observed improvement is not only statistically significant but also practically meaningful. The 95% confidence interval (CI [0.57–1.36]) further confirms the robustness of these results, indicating that the true difference in mean scores between the two groups is likely within this range. These findings support H4, demonstrating that the experimental design approach enhances the theoretical grounding of the resulting designs.
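The effect-size calculation mentioned above can be illustrated as follows; this sketch uses the normal approximation for U without tie correction to obtain Z and then Rosenthal’s r = Z / sqrt(N), and the per-project ratings are hypothetical placeholders, so it mirrors the procedure rather than reproducing the study data.

```python
# Illustrative sketch of a Mann-Whitney U comparison with Rosenthal's effect size r.
# Normal approximation of U, no tie correction; ratings below are hypothetical.
import numpy as np
from scipy import stats

def mann_whitney_r(a: np.ndarray, b: np.ndarray) -> tuple[float, float, float]:
    u, p = stats.mannwhitneyu(a, b, alternative="two-sided")
    n1, n2 = len(a), len(b)
    mu_u = n1 * n2 / 2                                   # expected U under H0
    sigma_u = np.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)      # SD of U under H0 (no ties)
    z = (u - mu_u) / sigma_u
    r = abs(z) / np.sqrt(n1 + n2)                        # Rosenthal's r = Z / sqrt(N)
    return u, p, r

# Hypothetical mean expert ratings per project (1-4 scale)
experimental = np.array([2.9, 3.1, 2.6, 2.8, 3.0, 2.7])
control      = np.array([1.7, 1.9, 2.0, 1.8, 1.6, 2.1, 1.8])
u, p, r = mann_whitney_r(experimental, control)
print(f"U = {u:.1f}, p = {p:.3f}, r = {r:.2f}")
```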
The evaluation of project creativity confirmed a statistically significant advantage for the experimental group (H5: Creativity of [E] group > Creativity of [C] group). Projects developed using the SBAM cards received substantially higher creativity ratings compared to those created with the control materials (M [E] = 3, SD [E] = 0.51; M [C] = 1.90, SD [C] = 0.37; Mann–Whitney U = 40.0, p = 0.008). The effect size was large (r = 0.74), indicating a strong and meaningful impact of the SBAM cards on the generation of innovative concepts (Figure 7).

5. Discussion

Both workshops were conducted with students who had a certain familiarity with design-related topics. In the first workshop, participants were Master’s students in Design for the Built Environment, with advanced academic training and greater design experience but limited knowledge in the field of sustainability apps. In the second workshop, participants were undergraduate students in Community Design, who had broader but less specialized training and more limited design experience. These differences likely influenced the way participants engaged with the tools, with the first group demonstrating a more structured approach to design tasks, while the second group may have exhibited a more exploratory attitude. Both groups declared a moderate to high level of tool usage in both workshops (M ≈ 4 on a scale from 1 to 7), suggesting that participants engaged sufficiently with the tools to provide reliable evaluations. This level of engagement supports the validity of the insights drawn from their feedback on the tools.
The results highlight significant differences in self-efficacy (H1) between the experimental and control conditions, as well as notable variations between the two workshops. In both cases, participants who used the SBAM cards showed a greater increase in self-efficacy compared to those who used the control material (a theoretical text). However, the magnitude of the effect was considerably larger in the second workshop (Cohen’s d = 0.99, 95% CI [0.11, 1.87]) than in the first (Cohen’s d = −0.15, 95% CI [−0.96, 0.65]), suggesting that the SBAM cards had a stronger impact on students with less prior design experience. Interestingly, in the first workshop, the control group even experienced a slight decrease in self-efficacy (delta = −2.50), potentially due to the nature of the theoretical material. The extensive format of the control document may have contributed to a sense of overwhelm, reducing participants’ confidence in applying sustainability principles. Conversely, in the second workshop, even the control group showed a slight improvement in self-efficacy (delta = 7.50), though the experimental group still demonstrated a significantly greater increase (delta = 24.72).
These findings suggest that the SBAM cards are particularly beneficial for individuals with less design experience, possibly because they provide a structured yet accessible way to engage with sustainability principles, whereas experienced designers might already rely on their existing knowledge.
In terms of creativity and the support provided by the tools, the Creativity Support Index (CSI) of the experimental group exceeded the threshold of 50 in both workshops (H2a), reaching a medium level: the CSI maps onto letter-grade ranges, with scores below 50 corresponding to an ‘F’ grade and those above 90 to an ‘A’. The SBAM cards achieved an almost ‘C’ grade, serving as a solid foundation while still allowing for slight improvements in fostering creativity. Additionally, the SBAM cards demonstrated greater support for creativity compared to the control material, with statistically significant differences (H2b), with large effect sizes in both workshops (First: Cohen’s d = 1.46, 95% CI [0.66, 2.26]; Second: Cohen’s d = 1.31, 95% CI [0.42, 2.21]). This reinforces the idea that structured design prompts—such as those provided by the cards—can effectively support creative thinking in collaborative design contexts, especially when compared to more traditional theoretical resources.
Similarly, perceived usability (H3a, H3b) showed a clear advantage for the SBAM cards. The SUS scores of the experimental group consistently exceeded the usability benchmark of 68, indicating a good level of usability. The difference between groups was also statistically significant, with large effect sizes (First: Cohen’s d = 1.40, 95% CI [0.42, 2.38]; Second: Cohen’s d = 1.47, 95% CI [0.59, 2.34]). Notably, the SUS score in the second workshop was even higher (80.56) than in the first (77.5), suggesting that undergraduate students found the cards even more intuitive to use compared to more experienced designers.
Perceived usefulness (H3c) was also significantly higher for the SBAM cards compared to the control materials, confirming their added value. However, the effect size was slightly smaller in the second workshop (Cohen’s d = 0.85, 95% CI [0.02, 1.69]) compared to the first (Cohen’s d = 1.29, 95% CI [0.51, 2.08]). This might indicate that participants with more design experience (first workshop) recognized the strategic potential of the cards more easily, while less experienced participants (second workshop) may have needed additional guidance to fully appreciate their usefulness.
The analysis of qualitative comments provided an in-depth perspective on the use of the materials. Participants in the experimental group praised the SBAM cards for their ability to facilitate idea generation, improve teamwork, and provide practical guidelines. However, some areas for improvement emerged, such as adding thematic classifications or more direct guiding questions to simplify card selection. The observation that not all cards are equally useful for every project highlights the importance of flexibility in applying the tool. On the other hand, participants in the control group described the documentation material as complex and difficult to use in collaborative and creative contexts. Several students reported that the documentation material felt too abstract, reinforcing the importance of tangible and structured design tools like the SBAM cards.
Finally, the analysis of project quality confirmed a statistically significant advantage for the experimental group in theoretical grounding (H4). Projects developed with SBAM cards received substantially higher ratings than those from the control group, demonstrating their effectiveness in facilitating the integration of theoretical principles into design. While the difference in scores (M [C] = 1.84; M [E] = 2.85, on a 1–4 scale) is moderate in absolute terms, the large effect size suggests that this improvement is both consistent and meaningful, while also leaving room for further refinement. The results also indicate that the use of SBAM cards significantly enhanced creativity in mobile app design (M [C] = 1.90; M [E] = 3, on a 1–4 scale): the experimental group developed concepts that were rated as significantly more creative than those produced by the control group, with a large effect size (r = 0.74). These findings align with prior research on design tools, suggesting that structured support materials can encourage divergent thinking and ideation by providing inspiration and guiding the creative process.

6. Limitations

While this study provides valuable insights into the use of SBAM cards for sustainability-driven mobile app design, several limitations should be acknowledged.
One limitation of this study is that all participants were design students rather than professional designers. While this choice was intentional—considering that junior designers and recent design graduates are the users who might benefit the most from design cards as an ideation tool—it would be interesting to assess how experienced designers would engage with the SBAM cards, since experienced designers might also benefit from a guiding tool, given the specificity of the design products considered, i.e., green apps promoting sustainable behaviors. Further research is needed to explore whether expert designers find the tool useful, how they integrate it into their workflow, and whether its structured approach to sustainability-driven app design aligns with their existing design processes.
Additionally, the study focused on a single design session, assessing the immediate impact of SBAM cards on ideation and self-efficacy. Given that design processes often unfold over extended periods, it would be valuable to examine how and if the cards continue to be used throughout a full design cycle. A longitudinal study could provide deeper insights into their role in iteration, decision making, and concept refinement over time.
Another consideration is that the use cases provided to participants were predefined to ensure consistency across groups and avoid excessive time spent on problem definition. While this approach maintained comparability, it may have constrained creativity to some extent. Future research could explore more open-ended methodologies, allowing designers to identify their own sustainability challenges and assessing the flexibility of the tool in different contexts.

7. Conclusions

This study introduced and evaluated the SBAM cards as a design tool aimed at supporting the ideation of mobile apps that promote sustainable behaviors. The SBAM cards are available at the following link: https://unisob-snv-humades.notion.site/SBAM-cards-and-guidelines-1744fb4ba2e880a19079c734b7ad95b9?pvs=4, accessed on 20 February 2025. The SBAM card assessment was performed by means of two workshops involving undergraduate and graduate students from the Faculty of Design and Architecture of the University of Naples Federico II. The authors had no prior relationship with the students, thus reducing the risk of personal biases or preconceptions that could have affected the evaluation of the tool. The evaluation of the SBAM cards focused on several key dimensions: the theoretical grounding and the creativity of the design choices in the mobile app projects developed using the tool, as well as subjective dimensions such as the perceived self-efficacy and experienced creativity of card users in designing mobile apps fostering sustainable behaviors, and the perceived usability and usefulness of the tool itself. These dimensions were assessed during the workshops using a between-subjects experimental design with the following two conditions: experimental (using the SBAM cards) and control (using textual documentation material similar to a scientific paper). The methodology and evaluation tools were adapted from the studies of Caraban et al. [25] and Konstantinou et al. [33], with some adjustments. For instance, the Creativity Support Index (CSI) was employed in its complete, original version, allowing for the calculation of an absolute creativity score supported by the SBAM cards. Additionally, a specific tool—documentation material—was introduced for the control group, enabling a more precise comparison between the experimental and control conditions across creativity, usability, and perceived usefulness. Lastly, usability was assessed using the SUS scale, further complemented by the perceived usefulness scale from TAM to provide a broader measure of evaluation.
The results were highly positive; the SBAM cards proved to be an effective tool for supporting novice designers, demonstrating their value in fostering creativity and enhancing self-efficacy in the design of apps aimed at promoting sustainable behaviors. The stronger impact of the SBAM cards on undergraduate students suggests that such tools can be particularly valuable in early design education, helping novices structure their ideas and engage with sustainability in a practical way. Participants also reported higher perceived utility and usability with the SBAM cards compared to the control material. The SBAM cards were appreciated for their intuitiveness and their ability to facilitate collaborative work and idea generation.
Beyond self-reported measures, the evaluation of project quality confirmed that the SBAM cards significantly improved theoretical grounding (H4) and creativity (H5). Designers using the cards integrated key behavioral change and sustainability principles more effectively, ensuring stronger theoretical foundations. At the same time, the higher creativity scores suggest that the SBAM cards stimulated more innovative and original ideas, fostering divergent thinking. While the absolute score differences were moderate, the large effect sizes indicate that these improvements are systematic and meaningful. Overall, the SBAM cards proved to be a valuable tool for bridging theory and creative ideation in designing mobile apps promoting sustainable behaviors and habits.
This study opens several avenues for future research. The analysis of open-ended comments highlighted the need for clearer usage instructions to guide designers in integrating SBAM cards effectively during ideation. This would be particularly beneficial when used by less experienced designers, in extended design sessions, or without preliminary theoretical guidelines. Participants also suggested categorizing the cards and making the guiding questions more direct and actionable. To address this, future iterations could introduce a structured classification system and refine the questions to enhance clarity and usability. To ensure these improvements align with designers’ needs, a co-design approach could be adopted, involving both junior and expert designers in shaping the categorization, question refinement, and instructional materials. This would help optimize the tool’s usability and effectiveness in the ideation of mobile apps improving individual actions for sustainability.
We acknowledge that expanding the variety of case studies could further strengthen the understanding of how the SBAM cards support mobile app design across different sustainability challenges. In our study, the four predefined use cases (water waste, energy consumption, mobility, and food waste) were intentionally selected to ensure consistency across participants and to prevent excessive time being spent on defining a specific problem rather than engaging with the design process itself. By providing structured sustainability challenges, we aimed to focus the evaluation on how participants interacted with the SBAM cards, rather than on their ability to formulate a relevant problem space. However, we recognize the potential benefits of a more open-ended approach, in which participants either define their own sustainability challenge or explore industry-specific applications (e.g., sustainability in retail or smart cities). Future research could adopt such an approach, allowing participants to independently identify environmental challenges for mobile app design while using the SBAM cards as a guiding framework. This would provide deeper insights into how the cards facilitate creative ideation in design contexts where designers autonomously select the sustainability issue to address, thus expanding upon the current findings.
Another promising direction involves testing the applicability of the SBAM design cards in professional settings, in collaboration with companies, municipalities, or small organizations committed to projects aimed at reducing wasteful behaviors at an organizational level. This would help validate their adoption in professional sustainability-driven design processes and assess their adaptability. Furthermore, testing their effectiveness in real-world co-design scenarios would not only confirm their usability in professional contexts but also provide realistic insights into how different stakeholders, such as designers, policymakers, and entrepreneurs, can leverage the tool to create sustainable digital solutions.
Additionally, future studies should investigate how experienced designers interact with the SBAM cards in professional app development contexts. Exploring their adoption by expert designers could clarify whether the cards serve as a useful structured support tool or whether their prescriptive nature conflicts with established design expertise. Understanding this dynamic would help refine the tool's applicability for both novice and professional designers.
Finally, future research could explore how the SBAM cards are used in real-world design processes, particularly in iterative ideation cycles. While this study assessed their effectiveness in a single 90-minute session, ideation in practice is not necessarily confined to such a timeframe. Instead, it often involves multiple iterations and refinements, in line with a human-centered design approach. Understanding whether and how designers revisit the cards over time—either within the same project or across different projects—would provide deeper insights into their long-term relevance and practical integration within design workflows.

Author Contributions

Conceptualization, R.P.; methodology, R.P. and C.T.; validation, R.P. and C.T.; formal analysis, C.T.; investigation, R.P., C.T. and L.M.; resources, R.P. and C.T.; data curation, C.T. and R.P.; writing—original draft preparation, C.T. and R.P.; writing—review and editing, R.P. and C.T.; visualization, C.T. and R.P.; supervision, R.P. and R.M.; project administration, R.P. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki, and it was approved by the Ethics Committee of University of Naples Suor Orsola Benincasa (protocol code 4403, 23 September 2024).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data that support the findings of this study are available from University Suor Orsola Benincasa, but restrictions apply to the availability of these data, which were used under license for the current study and so are not publicly available. The data can, however, be made available from the authors upon reasonable request and with the permission of University Suor Orsola Benincasa.

Acknowledgments

The authors would like to thank Pietro Nunziante for making it possible to organize these workshops with their students from the University of Naples Federico II, who are also sincerely acknowledged for their participation. Special thanks go to Rossella Ambrosio for her valuable contribution to the design of the SBAM cards, as well as to the app project design evaluators for their support in assessing the project outcomes.

Conflicts of Interest

Author Roberto Montanari is the owner of the company RE:LAB Srl. The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
CSI     Creativity Support Index
SBAM    Sustainable Behavior Application for Mobile devices
SUS     System Usability Scale
TAM     Technology Acceptance Model
UX      User Experience
UI      User Interface

Appendix A

In this appendix, we present the PowerPoint template used by the participants to submit their projects.
Figure A1. First slide of the PowerPoint template.
Figure A2. Second slide of the PowerPoint template.
Figure A3. Third slide of the PowerPoint template.
Figure A4. Fourth slide of the PowerPoint template.
Figure A5. Fifth slide of the PowerPoint template.

References

  1. Lee, H.; Calvin, K.; Dasgupta, D.; Krinner, G.; Mukherji, A.; Thorne, P.; Trisos, C.; Romero, J.; Aldunce, P.; Barret, K.; et al. IPCC, 2023: Climate Change 2023: Synthesis Report, Summary for Policymakers. Contribution of Working Groups I, II and III to the Sixth Assessment Report of the Intergovernmental Panel on Climate Change; Core Writing Team, Lee, H., Romero, J., Eds.; IPCC: Geneva, Switzerland, 2023. [Google Scholar]
  2. Mastorakis, G.; Kopanakis, I.; Makridis, J.; Chroni, C.; Synani, K.; Lasaridi, K.; Abeliotis, K.; Louloudakis, I.; Daliakopoulos, I.N.; Manios, T. Managing Household Food Waste with the FoodSaveShare Mobile Application. Sustainability 2024, 16, 2800. [Google Scholar] [CrossRef]
  3. Peeters, A.L.; van der Werff, E.; Tromp, N. Designing for value-behaviour consistency: Ethical choice architecture to stimulate sustainable meat purchase. Clean. Responsib. Consum. 2022, 5, 100067. [Google Scholar] [CrossRef]
  4. Blake, J. Overcoming the ‘value-action gap’ in environmental policy: Tensions between national policy and local experience. Local Environ. 1999, 4, 257–278. [Google Scholar] [CrossRef]
  5. Vermeir, I.; Verbeke, W. Sustainable food consumption: Exploring the consumer “attitude–behavioral intention” gap. J. Agric. Environ. Ethics 2006, 19, 169–194. [Google Scholar] [CrossRef]
  6. Doğan-Südaş, H.; Kara, A.; Karaca, E. Effects of Gamified Mobile Apps on Purchase Intentions and Word-of-Mouth Engagement: Implications for Sustainability Behavior. Sustainability 2023, 15, 506. [Google Scholar] [CrossRef]
  7. Pinder, C.; Vermeulen, J.; Cowan, B.R.; Beale, R. Digital behaviour change interventions to break and form habits. ACM Trans. Comput.-Hum. Interact. (TOCHI) 2018, 25, 1–66. [Google Scholar] [CrossRef]
  8. Egan, C.; Benyon, D. Sustainable HCI: Blending permaculture and user-experience. In Proceedings of the 2017 ACM Conference Companion Publication on Designing Interactive Systems, Edinburgh, UK, 10–14 June 2017; pp. 39–43. [Google Scholar]
  9. Fogg, B.J. A behavior model for persuasive design. In Proceedings of the 4th International Conference on Persuasive Technology, Claremont, CA, USA, 26–29 April 2009; pp. 1–7. [Google Scholar]
  10. Ajzen, I.; Fishbein, M.; Lohmann, S.; Albarracín, D. The influence of attitudes on behavior. In The Handbook of Attitudes, Volume 1: Basic Principles; Routledge: New York, NY, USA, 2018; pp. 197–255. [Google Scholar]
  11. Verplanken, B.; Holland, R.W. Motivated decision making: Effects of activation and self-centrality of values on choices and behavior. J. Personal. Soc. Psychol. 2002, 82, 434. [Google Scholar] [CrossRef] [PubMed]
  12. Maiteny, P.T. Mind in the Gap: Summary of research exploring ‘inner’ influences on pro-sustainability learning and behaviour. Environ. Educ. Res. 2002, 8, 299–306. [Google Scholar] [CrossRef]
  13. Ryan, R.M.; Deci, E.L. Intrinsic and extrinsic motivations: Classic definitions and new directions. Contemp. Educ. Psychol. 2000, 25, 54–67. [Google Scholar] [CrossRef]
  14. Bergram, K.; Djokovic, M.; Bezençon, V.; Holzer, A. The digital landscape of nudging: A systematic literature review of empirical research on digital nudges. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, New Orleans, LA, USA, 30 April–5 May 2022; pp. 1–16. [Google Scholar]
  15. Caraban, A.; Karapanos, E.; Goncalves, D.; Campos, P. 23 ways to nudge: A review of technology-mediated nudging in human–computer interaction. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, Scotland, UK, 4–9 May 2019; pp. 1–15. [Google Scholar]
  16. Deterding, S.; Dixon, D.; Khaled, R.; Nacke, L. From game design elements to gamefulness: Defining “gamification”. In Proceedings of the 15th International Academic MindTrek Conference: Envisioning Future Media Environments, Tampere, Finland, 28–30 September 2011; pp. 9–15. [Google Scholar]
  17. Tancredi, C.; Presta, R.; Di Lorenzo, V. Promoting sustainable behaviors through mobile apps: SBAM design guidelines. Multimed. Tools Appl. 2024, 83, 74021–74052. [Google Scholar] [CrossRef]
  18. Sozoniuk, M.; Park, J.; Lumby, N. Investigating Residents’ Acceptance of Mobile Apps for Household Recycling: A Case Study of New Jersey. Sustainability 2022, 14, 10874. [Google Scholar] [CrossRef]
  19. Boncu, S.; Candel, O.S.; Popa, N.L. Gameful Green: A Systematic Review on the Use of Serious Computer Games and Gamified Mobile Apps to Foster Pro-Environmental Information, Attitudes and Behaviors. Sustainability 2022, 14, 10400. [Google Scholar] [CrossRef]
  20. Meireles, M.; Ribeiro, P.J.G. Digital Platform/Mobile App to Boost Cycling for the Promotion of Sustainable Mobility in Mid-Sized Starter Cycling Cities. Sustainability 2020, 12, 2064. [Google Scholar] [CrossRef]
  21. Mu, W.; Spaargaren, G.; Oude Lansink, A. Mobile Apps for Green Food Practices and the Role for Consumers: A Case Study on Dining Out Practices with Chinese and Dutch Young Consumers. Sustainability 2019, 11, 1275. [Google Scholar] [CrossRef]
  22. Cellina, F.; Bucher, D.; Veiga Simão, J.; Rudel, R.; Raubal, M. Beyond limitations of current behaviour change apps for sustainable mobility: Insights from a user-centered design and evaluation process. Sustainability 2019, 11, 2281. [Google Scholar] [CrossRef]
  23. Huang, H.; Su, D.; Peng, W. Novel Mobile Application System for Implementation of an Eco-Incentive Scheme. Sustainability 2022, 14, 3055. [Google Scholar] [CrossRef]
  24. Isensee, C.; Teuteberg, F.; Griese, K.M. Exploring the Use of Mobile Apps for Fostering Sustainability-Oriented Corporate Culture: A Qualitative Analysis. Sustainability 2022, 14, 7380. [Google Scholar] [CrossRef]
  25. Caraban, A.; Konstantinou, L.; Karapanos, E. The nudge deck: A design support tool for technology-mediated nudging. In Proceedings of the 2020 ACM Designing Interactive Systems Conference, Eindhoven, The Netherlands, 6–10 July 2020; pp. 395–406. [Google Scholar]
  26. Elsayed-Ali, S.; Berger, S.E.; Santana, V.F.D.; Becerra Sandoval, J.C. Responsible & Inclusive Cards: An online card tool to promote critical reflection in technology industry work practices. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, Hamburg, Germany, 23–28 April 2023; pp. 1–14. [Google Scholar]
  27. Lucero, A.; Dalsgaard, P.; Halskov, K.; Buur, J. Designing with cards. In Collaboration in Creative Design: Methods and Tools; Springer: Berlin/Heidelberg, Germany, 2016; pp. 75–95. [Google Scholar]
  28. Deng, Y.; Antle, A.N.; Neustaedter, C. Tango cards: A card-based design tool for informing the design of tangible learning games. In Proceedings of the 2014 Conference on Designing Interactive Systems, Vancouver, BC, Canada, 21–25 June 2014; pp. 695–704. [Google Scholar]
  29. Lafrenière, D.; Dayton, T.; Muller, M. Variations of a theme: Card-based techniques for participatory analysis and design. In Proceedings of the CHI’99 Extended Abstracts on Human Factors in Computing Systems, Pittsburgh, PA, USA, 15–20 May 1999; pp. 151–152. [Google Scholar]
  30. Wölfel, C.; Merritt, T. Method card design dimensions: A survey of card-based design tools. In Proceedings of the Human-Computer Interaction—INTERACT 2013: 14th IFIP TC 13 International Conference, Cape Town, South Africa, 2–6 September 2013; Proceedings, Part I 14. Springer: Berlin/Heidelberg, Germany, 2013; pp. 479–486. [Google Scholar]
  31. Roy, R.; Warren, J.P. Card-based design tools: A review and analysis of 155 card decks for designers and designing. Des. Stud. 2019, 63, 125–154. [Google Scholar] [CrossRef]
  32. Hsieh, G.; Halperin, B.A.; Schmitz, E.; Chew, Y.N.; Tseng, Y.C. What is in the cards: Exploring uses, patterns, and trends in design cards. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, Hamburg, Germany, 23–28 April 2023; pp. 1–18. [Google Scholar]
  33. Konstanti, C.; Karapanos, E.; Markopoulos, P. The Behavior Change Design Cards: A Design Support Tool for Theoretically-Grounded Design of Behavior Change Technologies. Int. J. Hum.-Comput. Interact. 2022, 38, 1238–1254. [Google Scholar] [CrossRef]
  34. Bandura, A. Guide for constructing self-efficacy scales. Self-Effic. Beliefs Adolesc. 2006, 5, 307–337. [Google Scholar]
  35. Cherry, E.; Latulipe, C. Quantifying the creativity support of digital tools through the creativity support index. ACM Trans. Comput.-Hum. Interact. (TOCHI) 2014, 21, 1–25. [Google Scholar] [CrossRef]
  36. Brooke, J. SUS: A “quick and dirty” usability scale. In Usability Evaluation in Industry; CRC Press: Boca Raton, FL, USA, 1996. [Google Scholar]
  37. Davis, F.D. Technology acceptance model: TAM. In Information Seeking Behavior and Technology Adoption; Al-Suqri, M.N., Al-Aufi, A.S., Eds.; IGI Global Scientific Publication: Hershey, PA, USA, 1989; Volume 205, p. 219. [Google Scholar]
  38. Amabile, T.M. Social psychology of creativity: A consensual assessment technique. J. Personal. Soc. Psychol. 1982, 43, 997. [Google Scholar] [CrossRef]
Figure 1. SBAM cards, developed based on the SBAM guidelines proposed by Tancredi et al. [17], aimed at supporting the design of mobile apps fostering sustainable behaviors.
Figure 2. Participants using the SBAM cards during the workshop.
Figure 3. Comparison of self-efficacy scores (Entry and Exit) for the control group (on the left) and for the experimental group (on the right) in the first and second workshops. The experimental group in both workshops shows significant increases in Exit scores compared to Entry scores.
Figure 4. Comparison of CSI scores between experimental and control groups across the first and second workshops. The experimental group consistently outperformed the control group, with significant differences observed in both workshops.
Figure 5. Comparison of SUS scores between experimental and control groups across the first and second workshops. The experimental group consistently scored higher than the control group, with significant differences in both workshops.
Figure 6. Comparison of perceived usefulness scores between experimental and control groups in the first and second workshops. The experimental group achieved higher scores in both workshops, but the differences were not statistically significant.
Figure 7. Comparison of scores for the Theoretical grounding and Creativity dimensions between experimental and control groups, as rated by the design quality evaluators. Both differences are statistically significant.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
