Article

Digital Overload among College Students: Implications for Mental Health App Use

by Arielle C. Smith 1, Lauren A. Fowler 1, Andrea K. Graham 2, Beth K. Jaworski 3, Marie-Laure Firebaugh 1, Grace E. Monterubio 1, Melissa M. Vázquez 1, Bianca DePietro 1, Shiri Sadeh-Sharvit 4, Katherine N. Balantekin 5, Naira Topooco 4,6, Denise E. Wilfley 1, C. Barr Taylor 4,7 and Ellen E. Fitzsimmons-Craft 1,*
1 Department of Psychiatry, Washington University School of Medicine, St. Louis, MO 63110, USA
2 Center for Behavioral Intervention Technologies, Northwestern University Feinberg School of Medicine, Chicago, IL 60611, USA
3 National Center for PTSD, Dissemination and Training Division, VA Palo Alto Health Care System, Menlo Park, CA 94025, USA
4 Center for m2Health, Palo Alto University, Palo Alto, CA 94304, USA
5 Department of Exercise and Nutrition Sciences, University at Buffalo, Buffalo, NY 14214, USA
6 Department of Behavioural Sciences and Learning, Linköping University, SE-581 83 Linköping, Sweden
7 Department of Psychiatry and Behavioral Sciences, Stanford University School of Medicine, Stanford, CA 94305, USA
* Author to whom correspondence should be addressed.
Soc. Sci. 2021, 10(8), 279; https://doi.org/10.3390/socsci10080279
Submission received: 5 June 2021 / Revised: 13 July 2021 / Accepted: 16 July 2021 / Published: 21 July 2021
(This article belongs to the Special Issue Technological Approaches for the Treatment of Mental Health in Youth)

Abstract

Mental health phone applications (apps) provide cost-effective, easily accessible support for college students, yet long-term engagement is often low. Digital overload, defined as information burden from technological devices, may contribute to disengagement from mental health apps. This study aimed to explore the influence of digital overload and phone use preferences on mental health app use among college students, with the goal of informing how notifications could be designed to improve engagement in mental health apps for this population. A semi-structured interview guide was developed to collect quantitative data on phone use and notifications as well as qualitative data on digital overload and preferences for notifications and phone use. Interview transcripts from 12 college students were analyzed using thematic analysis. Participants had high daily phone use and received large quantities of notifications. They employed organization and management strategies to filter information and mitigate the negative effects of digital overload. Digital overload was not cited as a primary barrier to mental health app engagement, but participants ignored notifications for other reasons. Findings suggest that adding notifications to mental health apps may not substantially improve engagement unless additional factors are considered, such as users’ motivation and preferences.

1. Introduction

The prevalence of mental health concerns among college students has been increasing in recent years (American College Health Association 2019; Lipson et al. 2019; Oswalt et al. 2020). College administrators have thus been tasked with the challenge of providing mental health services to the growing number of students in need. Mobile health (mHealth) technologies, which include smartphone applications (apps) for mental health, provide a unique mode for the scalable delivery of mental health support services to college students (Johnson and Kalkbrenner 2017; Lattie et al. 2019). The benefits of mHealth are twofold: students can easily access private and confidential features of mHealth interventions, and college administrators can provide low-cost interventions on a large scale (Montagni et al. 2020). Additionally, with 96% of US adults between the ages of 18 and 29 owning smartphones (Pew Research Center 2021), it is logical to presume that college students may be well-suited for mHealth service utilization. While self-report and objective data on young adults’ phone screen time vary across studies, college students consistently spend multiple hours on their phones daily (Roberts et al. 2014; Andrews et al. 2015; David et al. 2018; Ataş and Çelik 2019), suggesting that using a mental health app aligns with the existing behavioral routines of this population.
In addition to their propensity for phone use, young adults, and college students specifically, have reported interest in and willingness to engage with digital mental health services (Ahuvia et al. 2021; Cohen et al. 2021), and mHealth has been recommended by college students as a potential strategy for improving mental health care on college campuses (Cohen et al. 2020). Despite college students’ expressed interest in mental health apps and the potential benefits of incorporating apps into college mental health services, the intended impact cannot occur if engagement with such apps is low. Indeed, engagement rates for mental health apps are suboptimal, particularly in the long term. An analysis of data from 93 mental health apps showed that rates of opening the apps declined by 80% between the first and tenth day, and that the median 15-day retention was 3.9% (Baumel et al. 2019). In a survey of 741 college students, less than a quarter of the students who had used a mental health app continued to use it after four weeks (Kern et al. 2018). Exploring college students’ attitudes toward engagement strategies and their general phone use preferences can provide user-driven suggestions for how this population might integrate mental health apps into their consistent phone use, ultimately resulting in more students engaging with efficacious mHealth services.
One factor that may contribute to the poor use of mental health apps by college students is a phenomenon referred to as “digital overload.” Information overload in the digital context (i.e., digital overload) occurs when a high rate of information coming from multiple communication channels inhibits the capacity to process information efficiently or use information effectively (Bawden and Robinson 2009; Misra and Stokols 2012; Lehman and Miller 2020). With regard to smartphone usage, digital overload may result from simultaneously receiving information across multiple sources within the same device, such as text messages, phone calls, and notifications or alerts from mobile apps. Of note, the quantity of digital information received may not singlehandedly predict the experience of overload. Misra and Stokols (2012) noted that “perceived information overload” is a form of psychological stress. Similarly, research on the concept of “digital stress” emphasizes that it is the subjective experience of a stimulus (e.g., a given quantity of notifications) which is perceived as a stressor and thus varies between individuals (Steele et al. 2020). “Connection overload”—“distress resulting from the subjective experience of receiving excessive input from digital sources”—has been conceptualized as one component of digital stress (Steele et al. 2020). The emphasis on subjective experience in this definition is notable: connection demands measured in objective units alone have been negatively associated with negative affect, whereas accounting for the subjective experience of connection overload (e.g., deficient self-reaction, negative outcomes, stress) produces a positive association with negative affect (LaRose et al. 2014).
Digital overload and related concepts are associated with a range of adverse outcomes. Reactions to digital overload include: (1) information anxiety, a state of stress in which people feel powerless due to their inability to access, understand, or use necessary information; (2) information avoidance, in which people ignore useful information because there is too much of it to process; or (3) information withdrawal, in which people devise filtering strategies to sort through the bare minimum of relevant information (Bawden and Robinson 2009). Among college students specifically, digital overload contributes to lack of focus, decreased self-confidence, and increased stress and anxiety (Renjith 2017). In another sample of college students, higher perceived information overload at one point of contact predicted higher levels of perceived stress and poorer health status at a second point of contact (Misra and Stokols 2012). Additionally, measures of communication load that incorporate objective data (e.g., emails sent and received) as well as subjective experiences (e.g., perceived urge to check email) have been shown to be positively correlated with perceived stress and indirectly associated with burnout, depression, and anxiety (Reinecke et al. 2017). Understanding how digital overload affects college students, and how it may relate to their use of mental health apps and attention toward notifications from these apps, could allow for more tailored approaches to maximize user engagement with mobile mental health programs.
Therefore, the aim of this study was to explore how digital overload and other phone use factors might relate to the use of mental health apps among college students. We focused on college students, given our team’s ongoing research studying the effectiveness of a mobile platform for college students with or at high risk of depression, anxiety, and/or eating disorders (Fitzsimmons-Craft et al. 2021). Semi-structured interviews were conducted to collect quantitative data on phone use and notifications as well as qualitative data on (1) the digital overload phenomenon, (2) notification preferences and other phone use preferences that may generalize to mental health apps, and (3) recommendations on how mental health app use can be improved. These findings have the potential to explain how competing phone use demands contribute to disengagement from mental health apps and how this disengagement may be circumvented through app design decisions.

2. Materials and Methods

2.1. Participants

Participants were individuals who (1) were 18 years of age or older, (2) were currently enrolled as an undergraduate student at a university in the United States, and (3) endorsed recent (i.e., within the last year) or current use of a smartphone app designed to address mental health concerns. Out of 71 individuals who provided consent and initiated screening, 33 (46.48%) were eligible for the study. Of those eligible, 13 participants were selected to participate using maximum variation purposeful sampling (Palinkas et al. 2015; Patton 2002). Participants were selected to ensure a diverse sample based on gender identity, race, ethnicity, and undergraduate institution, so as to maximize generalizability. However, one of the 13 participants was excluded from data analysis because the primary mental health condition they disclosed experiencing, autism, was not a focus of the mobile mental health platform under study in the parent project (Fitzsimmons-Craft et al. 2021) and is a less common concern addressed by college student mental health centers (Center for Collegiate Mental Health 2021). The characteristics (gender, race and ethnicity, age, type of university, and year in school) of the 12 participants whose data were analyzed, along with their corresponding ID numbers, are presented in Table 1 below.
This sample size was deemed appropriate because it is consistent with published studies in the field of human-computer interaction, in which the most common sample size is 12 (Caine 2016). Furthermore, themes have been shown to emerge in as few as six interviews, and saturation has occurred after 12 interviews in homogeneous samples (Guest et al. 2006; Guest et al. 2020). While our sample was intentionally heterogeneous given the use of maximum variation purposeful sampling, inductive thematic saturation, defined as the non-emergence of new themes (Saunders et al. 2018), was reached with the 12 interviews, given that no new themes were identified when A.C.S. and L.A.F. met to discuss the last set of interviews each had coded.

2.2. Procedure

Study personnel contacted students at multiple universities to ask if they would distribute institutional review board (IRB)-approved recruitment materials (i.e., email, text message, digital flyer, social media post) to their online communities. The recruitment materials directed interested students to a Qualtrics survey, where they provided informed consent and were assessed for eligibility.
Eligible participants selected for participation were contacted by phone and text to schedule the videoconference interview, and they were also asked to enable screen time data collection on their phones if they had not done so already. Interviews with participants were conducted, recorded, and preliminarily transcribed by a HIPAA (Health Insurance Portability and Accountability Act)-compliant videoconference platform, Zoom. To gather data that could help answer the research question of interest (i.e., how, if at all, do college students experience digital overload as it relates to the use of mental health apps?), the authors developed a semi-structured interview guide, in line with the theoretical approach to thematic analysis (Braun and Clarke 2006). Theoretical thematic analysis was the approach selected because data related to the specific research question was the focus of coding (Braun and Clarke 2006). Nonetheless, this was an exploratory study, so no existing theory framed the interview guide, nor was the purpose of the study to construct a theory. The full interview guide is presented in the Supplementary Materials. The interview began with collecting participants’ screen time and notifications data as tracked within their phone settings, followed by asking participants questions regarding how they use and organize their phones, respond to notifications, and use apps for mental health purposes. The interview duration ranged from 41 to 58 min (mean = 53.15, SD = 4.51). After their videoconference interview, each participant was emailed an electronic Amazon gift card worth USD 20. All procedures and recruitment materials were approved by the Institutional Review Board (IRB ID #202009046).

2.3. Data Analysis

The six-phase thematic analysis framework (Braun and Clarke 2006), incorporating counting considerations from Hannah and Lautsch (2011), was used to analyze and present the data. Thematic analysis is a method for identifying, analyzing, and reporting patterns within data (Braun and Clarke 2006). For step one—familiarizing yourself with the data—the research team transferred Zoom-generated verbatim transcripts to Microsoft Word documents and checked each transcript against its respective recording to ensure accuracy. Next, two team members (A.C.S., L.A.F.) read each transcript. For step two—generating initial codes—the two team members coded each transcript separately. For step three—searching for themes—the two team members met on multiple occasions to merge related codes, compare merged codes, resolve inconsistencies in coding, and produce a collection of candidate themes, subthemes, and the data that had been coded in relation to them. For step four—reviewing themes—A.C.S. read the data that had been collated for each theme to appraise if they formed coherent patterns and then reread all the transcripts to determine if the themes reflected the meaning present in the dataset as a whole, making refinements at both stages as necessary. Additionally, a third team member (M.-L.F.), blind to initial codes, generated codes from a subset of randomly selected transcripts, which were compared to themes from step three. For step five—defining and naming themes—the themes and subthemes were labelled, presented to the wider research team for feedback, and unclearly defined themes were discussed and renamed by A.C.S., L.A.F., and senior authors. For step six—producing the report—the finalized themes, subthemes, and illustrative quotes were arranged into a table presented in the Supplementary Materials.
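As a minimal, hypothetical illustration of step three (searching for themes), the Python sketch below shows one way two coders’ independent code assignments per transcript excerpt could be compared to surface discrepancies for discussion; the excerpt IDs and code labels are invented for illustration and do not reflect the tooling or codes actually used in this study.

```python
def compare_codes(coder_a, coder_b):
    """Compare two coders' code assignments per transcript excerpt.

    coder_a, coder_b: dicts mapping excerpt IDs to sets of code labels.
    Returns the excerpts whose code sets differ, to be resolved in a coding meeting.
    """
    discrepancies = {}
    for excerpt in sorted(set(coder_a) | set(coder_b)):
        codes_a = coder_a.get(excerpt, set())
        codes_b = coder_b.get(excerpt, set())
        if codes_a != codes_b:
            discrepancies[excerpt] = {"coder_a": codes_a, "coder_b": codes_b}
    return discrepancies

# Hypothetical code assignments for two excerpts from one transcript.
coder_a = {"P1_excerpt_03": {"notification overwhelm"},
           "P1_excerpt_07": {"app organization"}}
coder_b = {"P1_excerpt_03": {"notification overwhelm", "context-dependent stress"},
           "P1_excerpt_07": {"app organization"}}

for excerpt, codes in compare_codes(coder_a, coder_b).items():
    print(excerpt, codes)  # only P1_excerpt_03 differs and would need discussion
```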

3. Results

3.1. Screen Time and Notifications Data

Eleven of the 12 participants provided phone use data. Total phone use from the previous day ranged from 4 h and 4 min to 14 h and 14 min, with an average use of 8 h and 55 min (SD = 3 h and 18 min). Participants received an average of 259 notifications in the past day (SD = 143) and picked up their devices an average of 108 times (SD = 54). Among participants’ top five most used apps, the most frequently reported were messaging apps such as iMessage and Facebook Messenger (n = 8), Instagram (n = 7), TikTok (n = 5), Snapchat (n = 5), and web browsers such as Google Chrome and Safari (n = 5). Among the top five apps that sent the most notifications, the most frequently reported were messaging apps (n = 10), Snapchat (n = 7), email apps such as Gmail and Outlook (n = 5), Instagram (n = 4), and the clock app (n = 4).
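As a small, hedged example of how descriptive statistics of this kind can be computed from phone-settings exports, the sketch below derives the mean and standard deviation of daily use and notification counts from hypothetical per-participant values; the numbers are illustrative and are not the study’s raw data.

```python
from statistics import mean, stdev

# Hypothetical daily totals for 11 participants: minutes of phone use and notification counts.
minutes_used = [244, 310, 415, 470, 505, 540, 575, 620, 660, 730, 854]
notifications = [90, 120, 150, 180, 210, 245, 280, 310, 350, 405, 509]

def fmt_hours(total_minutes):
    """Format a minute count as 'H h M min' to match how screen time is reported."""
    return f"{int(total_minutes) // 60} h {int(total_minutes) % 60} min"

print("Mean use:", fmt_hours(mean(minutes_used)), "| SD:", fmt_hours(stdev(minutes_used)))
print("Mean notifications:", round(mean(notifications)), "| SD:", round(stdev(notifications)))
```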

3.2. Themes

We identified six major themes related to participants’ attitudes and preferences regarding phone use and notifications, as well as how these factors impact engagement with mental health apps. Each of these themes along with subthemes and illustrative quotes from the data are discussed below and summarized in the Supplementary Materials. Participants are referred to by ID number (e.g., P1).

3.2.1. Attitudes and Behaviors toward General Phone Use

Participants were aware of their amount of screen time and perceived it as high. However, participants who were not tracking all of their app use data expressed surprise, particularly at data on notifications and pickups. The reactions of participants to their screen time were dependent on the purpose of their phone use. Negative reactions emerged when participants felt that their high phone use resulted in distraction from other activities, particularly when the use centered on entertainment and social media apps. Participants also felt high phone use negatively impacted their mental health, particularly in the form of social comparisons experienced while using social media:
“Checking social media can be a drag on mental health when you see that other people have like such a great life compared to you because of selective sharing on social media”.
(Participant 7)
Participants with negative reactions toward their high screen time reported attempts to reduce it. Ways to limit use of certain apps included turning off notifications, setting time limits, and deleting apps. However, participants reported reduced commitment to these changes over time: they checked the apps the same amount or even more with the notifications turned off, found ways to get around self-imposed time limits, and redownloaded the deleted apps.
Preliminary implications for mental health app design: Participants reported high phone screen time and large quantities of notifications, yet had varying reactions to this high level of phone use, depending on the context and purpose of it. App developers should consider and promote how their mental health app fits into phone use in ways that are perceived by users as positive.

3.2.2. Impact of Phone Organization on Phone Use

Participants organized the apps on their phones to impact their use of these tools. Participants deliberately customized their home screens to prioritize the apps they used the most and the apps they desired to use more, as exemplified by this participant:
“I try and keep my calendar, my school stuff closer towards the bottom to influence me to get back to work”.
(P4)
Additionally, participants organized their phones to strategically dissuade themselves from using the apps they wanted to limit use of, primarily entertainment and social media apps:
“On my front screen I have no social media apps. I used to have that before and then I was like, this is just a waste of time. If I have to go, because I know that if I have to actually go and slide all the way through everything to find it, then I’m not going to use it as often”.
(P12)
Tactics for organizing apps included placing prioritized apps within the first few home screens, and within or outside of folders. On the other hand, other participants explained that the layout of their phones was not geared toward prioritizing certain apps and therefore did not affect their phone use patterns. The participants in this category included those who did not change their home screens from the factory settings, those who designed their phone layouts for aesthetics rather than functionality (e.g., organized by color), and those who preferred to open apps using the search bar in the app library rather than clicking the app icons on their home screens. For example, among the iPhone users who based their phone layouts on aesthetics, the release of iOS 14 (Apple Inc., Cupertino, CA, USA) was marked as the turning point because of the artistic customization trends that proliferated when widgets became available. Those who used the search bar to open apps instead of navigating via app icons viewed this process as more intentional phone use:
“[I] swipe down and then type into the bar at the top because I also read somewhere that that’s apparently, like you’re more intentional, if you have to physically type it out”.
(P1)
Preliminary implications for mental health app design: Users organize their phones idiosyncratically. People invited to use mental health apps should be encouraged to consider where they would place an app for it to be most beneficial.

3.2.3. Using Apps for Mental Health Purposes

Participants’ perceptions of what it means to use an app for their mental health were dependent on their different definitions of mental health. Participants conceptualized “mental health separately from mental illness” (P1) as having physical elements including “brain chemistry” and connections to “physical health” (P2) and as related to well-being “emotionally, spiritually” (P11). Participants connected mental health to the content of their thoughts: “A relatively stable positive like sense of where you are in life, you know, how you feel about yourself, how you feel about the people around you” (P7). Participants associated mental health with actions such as “cleaning and maintaining and upkeeping that headspace” (P10) and “having a plan or ways to feel better” (P3).
Participants discussed using a wide variety of apps for meeting their mental health needs; however, the majority of these apps were not designed specifically to address mental health concerns. Participants used entertainment apps, such as games, TikTok, YouTube, ESPN (an American cable television network), and Pinterest to deliberately disengage from difficult emotions and reduce stress. Participants framed this phone use as purposeful coping, describing it as “very good to temporarily get your mind off of something” (P4) and to “feel more regulated mentally and emotionally” (P9). Participants also used apps as informal self-help tools. For instance, participants reported using apps that provided positive affirmations and bible verses, meditation videos from YouTube, and calming music from Spotify. Participants followed mental wellness content creators on YouTube and Instagram. Journaling and notes apps were used by participants to process thoughts and write down mental health tips they wanted to remember.
Participants described the importance of using text messages and Snapchat to reach out to friends when they needed support. Participants also sought mental health support by engaging with communities on social media platforms, watching YouTube videos of people sharing their mental health related stories, and reading mental health blogs.
Apps reported by participants as focusing on mental health included apps for general mental well-being, self-help for mental health concerns, mindfulness meditation, peer support, and mood tracking. Meditation apps such as Calm, Headspace, and Ten Percent Happier were popular. Participants also used apps with a mood tracking component including Reflectly, Daylio, Flo, and Lift. Within Flo and Lift, participants also described engaging with peer support communities.
Preliminary implications for mental health app design: Users have various definitions for mental health and use a variety of techniques to help them cope. Many of the apps and activities reported would not be considered as focusing on clinical mental health problems. App designers need to consider how mental health support activities (e.g., meditation, religious practices) can or should be included in programs addressing traditional mental health problems (e.g., anxiety, depression, eating disorders) as well as how potential users understand the purpose of a given app.

3.2.4. Barriers to Regular Use of Mental Health Apps

Although participants had used apps specifically designed for mental health concerns within the last year, they did not report consistent use of these apps at the time of the interviews. Participants explained that they did not want or need to use a mental health app regularly because of other systems they had in place to manage their mental health concerns.
“I just got into the routine of my boyfriend sleeping with me and like I wasn’t going to play an app with him there, there’s just no point”.
(P2)
They also voiced apprehension about investing time and sometimes money into apps, especially those that they were not confident would help them. Even participants who expressed finding a good fit with a mental health app and high levels of motivation to use it did not report consistent use. They explained that the idea of incorporating a mental health app into their regular phone use routine did not align with their desire to only use the app when they perceived a direct need for it:
“I do use Calm, not on a daily basis or something, but more if I’m like having a specific incident with anxiety, I will use it to calm me down”.
(P7)
Preliminary implications for mental health app design: Issues of time, convenience, cost, just-in-time or as needed use, and concerns about efficacy need to be addressed up front in mental health apps.

3.2.5. Reactions and Behavioral Responses to Notifications

In assessing the impact of notifications as a facet of digital overload, one theme concerned participants’ specific responses to the notifications on their phones. Participants were aware of their screen time but expressed surprise about how many notifications they received. The reactions participants had to their notifications varied. Participants discussed feeling overwhelmed as many notifications accumulated. On the other hand, other participants considered their high volume of notifications to be appropriate or expected and reacted neutrally. Participants also mentioned they had disabled notifications they did not need or want. Participants who did not limit notifications expressed irritation with notifications that they “really don’t need” (P6) or those that “can be pretty excessive and distract me” (P9). Examples of notifications falling in these categories included unsolicited notifications from social media apps about what the people they follow are doing on their accounts and notifications regarding deals from food delivery and rideshare apps.
In addition to factors related to the quantity and type of notifications, participants’ reactions to notifications were often dependent on context. For example, participants mentioned that notifications became overwhelming when they were already feeling stressed or anxious due to other commitments. Participants described dissatisfaction with notifications perceived as disruptive to other activities:
“When I got emails in my notifications, it could make it so that when I was trying to enjoy myself, like it just was like obstructing”.
(P7)
“If a notification interrupts me doing something else, I don’t look at it... I’m irritated, because I’m like focused on a thing and it’s in the way now”.
(P10)
Participants mentioned that they experienced excitement about notifications related to social interaction.
Participants reported strategies for managing notifications depending on their reactions, the types of notifications, and the contexts in which they were received. For instance, participants disabled notifications, either by putting their phones on “Do Not Disturb” or by silencing undesired notifications for specific apps. Turning off distracting notifications was perceived as having mixed effects. For example, participants reported checking apps without notifications more frequently after having disabled the notifications:
“I think that not having them [notifications] is the cause of me checking it more”.
(P3)
“I feel like I check it often, if not more, because I’m like, I wonder if someone sent me something”.
(P6)
For the notifications that participants had enabled, the highest priority notifications were direct communications with close others, such as text messages from a family member or friend. On the low end of priority were app-based automated notifications. Participants stated that they disliked these notifications because it felt clear they were sent to persuade them to open the apps. These participants preferred the autonomy of opening an app when they decided to, not when an app told them to.
In addition to the importance of the source of the notification, the context in which it was received mattered. If a notification, even a high priority one, interrupted something perceived as important, then participants ignored it completely or read it but did not open the app.
Participants’ opinions varied drastically regarding whether they thought notifications would help increase engagement with mental health apps. Participants who reported wanting notifications from mental health apps suggested that they be informative (provide tips and reminders), graphically interesting, and personalized. The preferred timing and frequency of notifications also varied: some participants wanted automatic notifications once or twice a day, others wanted to customize the time and frequency, and still others found this kind of customization impractical.
Participants discussed how the success of notifications ultimately rested on their desire to use the app in the first place. They shared that, if they wanted to use a mental health app regularly, notifications would likely succeed in reminding them to use it. However, if they were not motivated to use the app, they would likely ignore the notifications or delete the app altogether out of frustration from unwanted notifications.
Preliminary implications for mental health app design: Users should have a choice about whether notifications are used and some choice as to the type, style, and timing of the notifications. Findings also suggest that addressing motivation to use a mental health app through means beyond the external pull of notifications is important.
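To make this implication concrete, the hedged sketch below models a hypothetical per-user notification preference object (opt-in flag, content types, daily time window, and frequency cap) and a simple check that gates whether a reminder is sent; the field names and defaults are assumptions for illustration, not features of any app examined in this study.

```python
from dataclasses import dataclass, field
from datetime import datetime, time

@dataclass
class NotificationPrefs:
    """Hypothetical user-controlled notification settings for a mental health app."""
    enabled: bool = False                                         # off unless the user opts in
    content_types: set = field(default_factory=lambda: {"tip", "reminder"})
    window_start: time = time(9, 0)                               # daily window chosen by the user
    window_end: time = time(21, 0)
    max_per_day: int = 1                                          # user-chosen frequency cap

def should_send(prefs: NotificationPrefs, content_type: str,
                now: datetime, sent_today: int) -> bool:
    """Send only if the user opted in, wants this content, and the window and cap allow it."""
    return (prefs.enabled
            and content_type in prefs.content_types
            and prefs.window_start <= now.time() <= prefs.window_end
            and sent_today < prefs.max_per_day)

# Example: a user who opted in to a single daily reminder between 9:00 and 21:00.
prefs = NotificationPrefs(enabled=True, content_types={"reminder"}, max_per_day=1)
print(should_send(prefs, "reminder", datetime(2021, 7, 1, 10, 30), sent_today=0))  # True
print(should_send(prefs, "tip", datetime(2021, 7, 1, 10, 30), sent_today=0))       # False
```

A design choice worth noting in this sketch is that notifications default to off, reflecting participants’ stated preference for autonomy over app-initiated prompts.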

3.2.6. Suggestions for Improving Integration and Engagement of Mental Health Apps

In addition to providing feedback on notifications, participants outlined other features of mental health apps that could encourage use. Participants recommended that a mental health app have multiple functions, even things unrelated to mental health, to more seamlessly integrate it into everyday life and decrease the number of apps checked daily:
“Probably something that does multiple things. Like if it were like a mental health app, it would I don’t know, have a weather component or something […] little things like that I think are like a lot easier to then incorporate into your schedule”.
(P10)
To this end, participants indicated that a mental health app could connect with other apps that they already regularly use, which could make the mental health app more easily integrated into their routines. By contrast, other participants suggested that a mental health app should have a single specific purpose, as that would warrant taking the time to use it:
“Having a direct purpose for the application would make it easy to justify taking away time from something else”.
(P12)
Usability factors were highlighted, such as mental health apps that “have easy navigation” (P12), are “pretty looking” (P1), and can be “customized” (P9) and “tailored” (P5). In terms of content, participants emphasized the importance of variety so that use of the app can be tailored to different situations and purposes. This included variety in the type of content and length of activities, as well as the recommendation that new content be introduced regularly:
“Something that does motivate me to use it, is the fact that there’s like basically a meditation for any kind of mood that I’m in. And for whatever length I need… I know that they’re going to keep adding things which makes it like worth continuing to engage with it”.
(P1)
Peer support or some kind of connection to other people was also important to participants:
“I think one thing that mental health apps lack... is some sort of like, that idea of interacting with other people”.
(P4)
In referencing apps that they use frequently, including Snapchat, Duolingo, and Notion, participants mentioned that “streaks” (P6), a “visual representation” of consecutive use (P3), or an “Achievements” feature (P1) to encourage everyday use would be helpful.
Preliminary implications for mental health app design: Ideas to consider in developing mental health apps are to design them to have multiple functions and to interact with other apps, look good, be easy to navigate, be able to be tailored and customized, and involve peers.
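To illustrate the “streaks” suggestion above, the hedged sketch below computes a consecutive-day use streak from a list of app-open dates; the data and function are hypothetical and are not taken from Snapchat, Duolingo, Notion, or any app discussed in this study.

```python
from datetime import date, timedelta

def current_streak(open_dates, today):
    """Count the consecutive days, ending today, on which the app was opened at least once."""
    opened = set(open_dates)
    streak = 0
    day = today
    while day in opened:
        streak += 1
        day -= timedelta(days=1)
    return streak

# Hypothetical usage log: the app was opened on each of the last three days.
log = [date(2021, 7, 19), date(2021, 7, 20), date(2021, 7, 21)]
print(current_streak(log, today=date(2021, 7, 21)))  # 3, which could be shown as a streak badge
```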

4. Discussion

The present study used semi-structured interviews to examine notification and general phone use preferences that may affect mental health app engagement among college students in order to identify potential strategies for addressing the engagement issues commonly encountered in digital mental health research. Our findings provide a nuanced account of college students’ general phone use, phone organization, experiences with notifications, use of apps for mental health purposes, integration of mental health apps, and suggestions for improving integration and engagement.
Our first goal was to obtain information on “digital overload.” Similar to other studies (Roberts et al. 2014; Andrews et al. 2015; David et al. 2018; Ataş and Çelik 2019), we found that college students spent a lot of time on their phones—almost nine hours a day, on average. We also found that participants received an average of about 260 notifications and picked up their devices an average of 108 times every day. While this high level of activity would seem to reflect objective “overload” as it is commonly conceptualized, the participants seemed, for the most part, comfortable living in this high digital use space and did not report issues of information anxiety, avoidance, or withdrawal—the definition of digital overload used by others (Bawden and Robinson 2009). That said, participants brought up a range of behaviors, reactions, and management strategies in response to notifications, but not necessarily in the context of an “overload” of digital information or in a way that produced the expected digital overload responses. Our findings are thus in line with other research suggesting that the related concept of digital stress is a subjective experience of a perceived stressor, such as a given quantity of notifications, that varies in accordance with perceived coping resources and relational contexts (Steele et al. 2020). Future research should carefully differentiate between objective and subjective experiences of digital overload and related concepts, since clarifying the definitions may be important for understanding outcomes. LaRose and colleagues (2014) elucidated this point: a model that takes subjective experiences of connection overload into account produces a positive correlation with negative affect, whereas objective units of connection demands are negatively correlated with negative affect.
Notably, others who have applied the Bawden and Robinson (2009) definition of information overload to digital contexts have called for refinement in terms. Lehman and Miller (2020) explained that “overload has been used as a catch-all for individual problems with levels of information.” They have suggested that various terms, which “categorize the contexts in which individuals struggle with information,” including digital overload, be more solidly defined and adopted (Lehman and Miller 2020). Importantly, this study suggests that generational differences be considered in the process of revising the definition of digital overload. It is possible that, because college students of today became phone owners at younger ages than older generations, they have adapted uniquely to processing digital information. While managing a large amount of digital information may provoke information anxiety, avoidance, or withdrawal for older generations, quantity alone did not elicit these adverse outcomes among participants in this study. This is consistent with other research, in which younger internet users were found to be less susceptible than users over the age of 50 to the stress, burnout, depression, and anxiety associated with communication load (Reinecke et al. 2017). Other factors, such as the content and source of notifications and the context in which they are received, appear to be more influential in dictating the emotional and behavioral responses of this population of digital natives, and should therefore be considered when updating the definition of digital overload with younger generations in mind.
A related aim of this study was to determine if notifications should be implemented to improve user engagement with mental health apps, or if this would contribute to digital overload and dissuade use. We anticipated that the majority of participants with a high volume of notifications would report experiencing digital overload; however, participants did not universally associate a feeling of anxiety or overwhelm with a high volume of notifications. Rather, their reactions to notifications were dependent upon the content of the notification, the source or app it came from, and the context in which it was received. For example, some participants reported distress when successive notifications accumulated to the point of feeling unmanageable, but others did not report any negative emotions from this. Participants also mentioned feeling overwhelmed, not due to the volume of notifications themselves, but because of situational factors that impeded their ability to attend to them. Steele and colleagues (2020) have similarly highlighted that individuals’ experiences of digital stress are expected to vary depending on their social and relational contexts. In the present study, most notifications were ignored if the message or source was not of interest. It is unlikely that adding notifications to mental health apps would increase engagement if users are not motivated to attend to the notifications in the first place. For users who are motivated to attend to them, thoughtful notifications may aid in increasing mental health app engagement, but further research on user preferences and how to implement them is necessary. Nevertheless, and perhaps more importantly, mental health programs need to compete with the many hundreds of other apps, services, notifications, and activities that college students use and receive.
Our third goal was to obtain information from college students about how mental health digital program use can be improved, with a particular goal of informing whether changes to the design of notifications would be helpful. We expected that digital overload from notifications would be a barrier to engagement with mental health apps. In line with other research, participants reported disengaging from apps at least in part because of frustration with a high volume of unwanted notifications (Vaghefi and Tulu 2019). Nevertheless, this was not a universal cause of disengagement. Unwanted notifications from mental health apps were ignored, and participants disengaged for other reasons ranging from poor app fit to internal barriers that impaired motivation. Additionally, participants disengaged because they did not want to use another app alongside the apps already embedded in their regular phone routines. Suggestions to address this barrier included designing mental health apps with multiple functions in order to reduce the total number of apps participants need to use regularly, or having a mental health function integrated into another app they already use. For example, this could take the form of having mental health chatbots embedded in messaging and social media apps. Since all users have pre-existing preferences and individualized needs, customizability of app features, including but not limited to notifications, is likely to improve use, as suggested by Melcher et al. (2020). In addition to customization within the app itself, app developers should encourage potential users to consider how they could best incorporate the app into their digital spaces and routines to further increase the likelihood of use. For example, participants in this study highlighted individualized strategies for phone organization to prioritize a mental health app, time of day and frequency preferences for mental health app use, and behavior changes aimed at increasing use.
An important takeaway from this study is that there are major discrepancies between researchers’ and consumers’ preferences for, and understanding of, using smartphone apps for mental health purposes. The eligibility criteria for this study required that participants endorse having used a smartphone app designed to address mental health concerns within the last year. However, when asked, “What apps do you use for your mental health?”, no participants began by discussing their experiences with specialized, evidence-based mental health apps. Instead, participants described using games, entertainment apps, and social media apps to deliberately distract from negative emotions. Smartphone use as a means of avoidance coping is associated with depression and anxiety (Panova and Lleras 2016). Yet, participants’ responses highlight how gamification and social elements could be utilized to improve user satisfaction with mental health apps.
When participants were subsequently prompted to describe their experiences with apps specifically designed for addressing mental health concerns (e.g., apps for general mental well-being such as meditation apps, app-based therapy or guided mental health interventions, self-help apps for a specific mental health concern, automated mental health chatbots, app-based peer support, mood trackers), participants were able to identify an app that fit into one of these categories and share their experiences with it; these were primarily mindfulness meditation apps, mental health peer support apps, and mood trackers. Cognitive behavioral therapy (CBT) interventions are recommended for use in mental health apps (Bakker et al. 2016), yet participants in this study did not report using an app with a robust CBT basis. Lift, with its psychoeducation modules, behavior change exercises, self-monitoring logs, and optional coaching component, was the app mentioned in the interviews that most resembles CBT-based (guided) self-help. However, although Lift is based on tested strategies, its efficacy has not, to our knowledge, been empirically examined. Similarly, the majority of mental health apps on the market have not been empirically tested (Neary and Schueller 2018), but this is not entirely surprising, as evidence base and users’ perceptions of usefulness are distinct (Schueller et al. 2018). Considerations of how users’ perceptions of usefulness drive engagement are perhaps as important as establishing the evidence base of mental health apps, if the goal is to encourage people to actually use the empirically supported apps.
Users’ highly varied personal preferences and the discrepancies between researchers’ and users’ perceptions are not insurmountable barriers to improving engagement. Rather, these user preferences and perceptions need to be considered in the design process because what is validated in the laboratory does not necessarily translate to user interfaces, user experiences on mobile apps, or user lived experiences (Sobolev et al. 2021). Participatory design (Muller and Kuhn 1993), which involves all stakeholders in an iterative process to address everyone’s needs, may be one way to bridge the gap between researchers and users of technology-based mental health interventions (Orlowski et al. 2015). The gap between research and practice can occur when evidence-based interventions do not necessarily fit the needs of the populations they are intended to reach, so enhancing the role that target populations play in the design process is an important aspect of improving the dissemination and implementation of health interventions (Chambers 2020). Digital mental health researchers should consider how to harness users’ creativity and input, both in guiding one’s personal decisions about app integration and in assisting with the design of apps themselves, as a primary strategy for improving mental health app engagement.
The exploratory nature of this semi-structured interview study produced a rich qualitative dataset of participants’ unique experiences. Nevertheless, there are limitations to the methods we used that affect the interpretation of the results. Due to the variety of recruitment materials distributed with the assistance of students at multiple universities, we were unable to report how many potential participants were reached beyond the 71 who consented to the online screening. Although we used purposeful sampling to maximize the representativeness of our sample, the themes generated from the 12 semi-structured interviews that we analyzed cannot be generalized to all college students; we therefore labelled our implications for mental health app design as “preliminary.” Further research is also needed with populations other than U.S. college students. Additionally, we did not ask participants for information about their socioeconomic status, and there were many more iPhone users than Android users. It is important to note, however, that the iPhone abundance is not surprising, given that the vast majority of Generation Z, which includes college-aged students, own and prefer iPhones over Android devices (Piper Sandler Companies 2020). We also did not ask participants prior to the interviews which apps specifically designed for mental health they had used, so mindfulness meditation apps were overrepresented in the interviews. Future research should include larger sample sizes that account for socioeconomic status, device type, and mental health app category. Regarding mental health app categories, however, the results of this study suggest that researchers’ and consumers’ understandings of mental health apps do not always align; therefore, current distinctions between mental health app categories may need to be re-evaluated. Furthermore, although participants primarily referenced mindfulness apps, the preliminary implications for mental health app design may be generalizable to many types of apps that include mental health and well-being content, given that apps ranging from Calm to PTSD Coach fall under the Health & Fitness category in both the Apple App Store and Google Play Store.
This study provides a nuanced look into college students’ preferences for integrating mental health apps into their consistent phone use, preferences that are important to consider when developing strategies to increase engagement. Notably, notifications from mental health apps may contribute to digital overload that dissuades use, but participants disengaged for numerous other reasons, including dissatisfaction with the apps themselves, internal barriers to use, different priorities in phone use, and competing demands in life. Through the identification of these influences, this study demonstrates that phone use factors as well as broader considerations impact users’ integration of mental health apps. The comprehensive suggestions that participants provided for improving mental health app integration and engagement demonstrate the utility of including target users in the design process. Further research on mental health app engagement that takes user preferences into account is an essential step toward the development and dissemination of efficacious, cost-effective, and widely available digital interventions from which college students burdened by mental health concerns can truly benefit.

Supplementary Materials

The following are available online at https://www.mdpi.com/article/10.3390/socsci10080279/s1, Supplementary File S1: Interview Guide and Thematic Analysis.

Author Contributions

Conceptualization, A.C.S., L.A.F., A.K.G., B.K.J., M.-L.F., G.E.M., M.M.V., B.D., S.S.-S., K.N.B., N.T., D.E.W., C.B.T., E.E.F.-C.; methodology, A.C.S., L.A.F., A.K.G., B.K.J., M.-L.F., G.E.M., M.M.V., B.D., S.S.-S., K.N.B., C.B.T., E.E.F.-C.; validation, A.C.S., L.A.F., M.-L.F.; formal analysis, A.C.S., L.A.F.; investigation, A.C.S.; writing—original draft preparation, A.C.S., L.A.F.; writing—review and editing, A.C.S., L.A.F., A.K.G., B.K.J., M.-L.F., G.E.M., M.M.V., B.D., S.S.-S., K.N.B., N.T., D.E.W., C.B.T., E.E.F.-C.; supervision, L.A.F., A.K.G., B.K.J., S.S.-S., K.N.B., N.T., D.E.W., C.B.T., E.E.F.-C.; project administration, A.C.S., G.E.M.; funding acquisition, L.A.F., A.K.G., K.N.B., N.T., D.E.W., C.B.T., E.E.F.-C. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Institute of Mental Health, grant number R01 MH115128; the National Health and Medical Research Council, grant number NHMRC APP1170937; the National Institutes of Health, grant number K08 MH120341 (E.E.F.-C.); the National Institutes of Health, grant number K01 DK116925 (A.K.G.); the National Institutes of Health, grant number K01 DK120778 (K.N.B.); the National Heart, Lung, and Blood Institute, grant number T32 HL130357 (L.A.F.); and the Swedish Research Council, grant number 2018-06585 (N.T.).

Institutional Review Board Statement

The study was conducted according to the guidelines of the Declaration of Helsinki, and approved by the Institutional Review Board (or Ethics Committee) of Washington University in St. Louis (IRB ID #202009046, approved on 09 September 2020).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available in the article and supplementary materials.

Acknowledgments

We thank Layna Paraboschi and Anneliese Haas for their assistance with de-identifying and proofreading the interview transcripts.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Ahuvia, Isaac L., Jenna Y. Sung, Mallory L. Dobias, Brady D. Nelson, Lauren L. Richmond, Bonita London, and Jessica L. Schleider. 2021. College Student Interest in Teletherapy and Self-Guided Mental Health Supports During the COVID-19 Pandemic. PsyArXiv Preprints. Available online: https://psyarxiv.com/8unfx/ (accessed on 21 May 2021).
2. American College Health Association. 2019. National College Health Assessment: Spring 2019 Reference Group Data Report. American College Health Association. Available online: https://www.acha.org/NCHA/ACHA-NCHA_Data/Publications_and_Reports/NCHA/Data/Reports_ACHA-NCHAIIc.aspx (accessed on 21 May 2021).
3. Andrews, Sally, David A. Ellis, Heather Shaw, and Lukasz Piwek. 2015. Beyond Self-Report: Tools to Compare Estimated and Real-World Smartphone Use. PLoS ONE 10: e0139004.
4. Ataş, Amine Hatun, and Berkan Çelik. 2019. Smartphone Use of University Students: Patterns, Purposes, and Situations. Malaysian Online Journal of Educational Technology 7: 54–70.
5. Bakker, David, Nikolaos Kazantzis, Debra Rickwood, and Nikki Rickard. 2016. Mental health smartphone apps: Review and evidence-based recommendations for future developments. JMIR Mental Health 3: 7:1–7:31.
6. Baumel, Amit, Frederick Muench, Stav Edan, and John M. Kane. 2019. Objective User Engagement with Mental Health Apps: Systematic Search and Panel-Based Usage Analysis. Journal of Medical Internet Research 21: 14567:1–14567:15.
7. Bawden, David, and Lyn Robinson. 2009. The Dark Side of Information: Overload, Anxiety and Other Paradoxes and Pathologies. Journal of Information Science 35: 180–91.
8. Braun, Virginia, and Victoria Clarke. 2006. Using Thematic Analysis in Psychology. Qualitative Research in Psychology 3: 77–101.
9. Caine, Kelly. 2016. Local Standards for Sample Size at CHI. Paper presented at CHI: Conference on Human Factors in Computing Systems, San Jose, CA, USA, April 28–May 3; pp. 981–92.
10. Center for Collegiate Mental Health. 2021. 2020 Annual Report. CCMH Annual Reports. Available online: https://ccmh.psu.edu/assets/docs/2020%20CCMH%20Annual%20Report.pdf (accessed on 21 May 2021).
11. Chambers, David A. 2020. Sharpening our Focus on Designing for Dissemination: Lessons from the SPRINT Program and Potential Next Steps for the Field. Translational Behavioral Medicine 10: 1416–18.
12. Cohen, Katherine A., Andrea K. Graham, and Emily G. Lattie. 2020. Aligning Students and Counseling Centers on Student Mental Health Needs and Treatment Resources. Journal of American College Health 2020: 1–19.
13. Cohen, Katherine A., Colleen Stiles-Shields, Nathan Winquist, and Emily G. Lattie. 2021. Traditional and Nontraditional Mental Healthcare Services: Usage and Preferences Among Adolescents and Younger Adults. The Journal of Behavioral Health Services & Research 2021: 1–17.
14. David, Meredith E., James A. Roberts, and Brett Christenson. 2018. Too Much of a Good Thing: Investigating the Association between Actual Smartphone Use and Individual Well-Being. International Journal of Human–Computer Interaction 34: 265–75.
15. Fitzsimmons-Craft, Ellen E., C. Barr Taylor, Michelle G. Newman, Nur Hani Zainal, Elsa Rojas-Ashe, Sarah Ketchen Lipson, Marie-Laure Firebaugh, Peter Ceglarek, Naira Topooco, Nicholas C. Jacobson, and et al. 2021. Harnessing Mobile Technology to Reduce Mental Health Disorders in College Populations: A Randomized Controlled Trial Study Protocol. Contemporary Clinical Trials 103: 106320:1–106320:10.
16. Guest, Greg, Arwen Bunce, and Laura Johnson. 2006. How Many Interviews Are Enough?: An Experiment with Data Saturation and Variability. Field Methods 18: 59–82.
17. Guest, Greg, Emily Namey, and Mario Chen. 2020. A Simple Method to Assess and Report Thematic Saturation in Qualitative Research. PLoS ONE 15: e0232076.
18. Hannah, David R., and Brenda A. Lautsch. 2011. Counting in Qualitative Research: Why to Conduct It, When to Avoid It, and When to Closet It. Journal of Management Inquiry 20: 14–22.
19. Johnson, Kaprea F., and Michael T. Kalkbrenner. 2017. The Utilization of Technological Innovations to Support College Student Mental Health: Mobile Health Communication. Journal of Technology in Human Services 35: 314–39.
20. Kern, Adam, Victor Hong, Joyce Song, Sarah Ketchen Lipson, and Daniel Eisenberg. 2018. Mental Health Apps in a College Setting: Openness, Usage, and Attitudes. MHealth 2018: 4.
21. LaRose, Robert, Regina Connolly, Hyegyu Lee, Kang Li, and Kayla D. Hales. 2014. Connection Overload? A Cross Cultural Study of the Consequences of Social Media Connection. Information Systems Management 31: 59–73.
22. Lattie, Emily G., Sarah Ketchen Lipson, and Daniel Eisenberg. 2019. Technology and College Student Mental Health: Challenges and Opportunities. Frontiers in Psychiatry 10: 246:1–246:5.
23. Lehman, Amanda, and Sophie Jo Miller. 2020. A Theoretical Conversation about Responses to Information Overload. Information 11: 379.
24. Lipson, Sarah Ketchen, Emily G. Lattie, and Daniel Eisenberg. 2019. Increased Rates of Mental Health Service Utilization by U.S. College Students: 10-Year Population-Level Trends (2007–2017). Psychiatric Services 70: 60–63.
25. Melcher, Jennifer, Erica Camacho, Sarah Lagan, and John Torous. 2020. College Student Engagement with Mental Health Apps: Analysis of Barriers to Sustained Use. Journal of American College Health 2020: 1–7.
26. Misra, Shalini, and Daniel Stokols. 2012. Psychological and Health Outcomes of Perceived Information Overload. Environment and Behavior 44: 737–59.
27. Montagni, Ilaria, Christophe Tzourio, Thierry Cousin, Joseph Amadomon Sagara, Jennifer Bada-Alonzi, and Aine Horgan. 2020. Mental Health-Related Digital Use by University Students: A Systematic Review. Telemedicine and e-Health 26: 131–46.
28. Muller, Michael J., and Sarah Kuhn. 1993. Participatory Design. Communications of the ACM 36: 24–28.
29. Neary, Martha, and Stephen M. Schueller. 2018. State of the Field of Mental Health Apps. Cognitive and Behavioral Practice 25: 531–37.
30. Orlowski, Simone Kate, Sharon Lawn, Anthony Venning, Megan Winsall, Gabrielle M. Jones, Kaisha Wyld, Raechel A. Damarell, Gaston Antezana, Geoffrey Schrader, David Smith, and et al. 2015. Participatory Research as One Piece of the Puzzle: A Systematic Review of Consumer Involvement in Design of Technology-Based Youth Mental Health and Well-Being Interventions. JMIR Human Factors 2: 12:1–12:21.
31. Oswalt, Sara B., Alyssa M. Lederer, Kimberly Chestnut-Steich, Carol Day, Ashlee Halbritter, and Dugeidy Ortiz. 2020. Trends in College Students’ Mental Health Diagnoses and Utilization of Services, 2009–2015. Journal of American College Health 68: 41–51.
32. Palinkas, Lawrence A., Sarah M. Horwitz, Carla A. Green, Jennifer P. Wisdom, Naihua Duan, and Kimberly Hoagwood. 2015. Purposeful Sampling for Qualitative Data Collection and Analysis in Mixed Method Implementation Research. Administration and Policy in Mental Health and Mental Health Services Research 42: 533–44.
33. Panova, Tayana, and Alejandro Lleras. 2016. Avoidance or Boredom: Negative Mental Health Outcomes Associated with Use of Information and Communication Technologies Depend on Users’ Motivations. Computers in Human Behavior 58: 249–58.
34. Patton, Michael Quinn. 2002. Qualitative Research and Evaluation Methods, 3rd ed. Thousand Oaks: Sage Publications.
35. Pew Research Center. 2021. Mobile Fact Sheet. Pew Research Center. Available online: https://www.pewresearch.org/internet/fact-sheet/mobile/ (accessed on 21 May 2021).
36. Piper Sandler Companies. 2020. Taking Stock with Teens: Fall 2020 Survey. Piper Sandler. Available online: https://www.pipersandler.com/3col.aspx?id=6039 (accessed on 21 May 2021).
  37. Reinecke, Leonard, Stefan Aufenanger, Manfred E. Beutel, Michael Dreier, Oliver Quiring, Birgit Stark, Klaus Wölfling, and Kai W. Müller. 2017. Digital Stress over the Life Span: The Effects of Communication Load and Internet Multitasking on Perceived Stress and Psychological Health Impairments in a German Probability Sample. Media Psychology 20: 90–115. [Google Scholar] [CrossRef]
  38. Renjith, R. 2017. The Effect of Information Overload in Digital Media News Content. Communication and Media Studies 6: 73–85. [Google Scholar]
  39. Roberts, James. A., Luc Honore Petnji Yaya, and Chris Manolis. 2014. The Invisible Addiction: Cell-Phone Activities and Addiction among Male and Female College Students. Journal of Behavioral Addictions 3: 254–65. [Google Scholar] [CrossRef] [Green Version]
  40. Saunders, Benjamin, Julius Sim, Tom Kingstone, Shula Baker, Jackie Waterfield, Bernadette Bartlam, Heather Burroughs, and Clare Jinks. 2018. Saturation in Qualitative Research: Exploring its Conceptualization and Operationalization. Quality & Quantity 52: 1893–907. [Google Scholar] [CrossRef]
  41. Schueller, Stephen M., Martha Neary, Kristen O’Loughlin, and Elizabeth C. Adkins. 2018. Discovery of and Interest in Health Apps Among Those with Mental Health Needs: Survey and Focus Group Study. Journal of Medical Internet Research 20: 10141:1–10141:10. [Google Scholar] [CrossRef]
  42. Sobolev, Michael, Rachel Vitale, Hongyi Wen, James Kizer, Robert Leeman, J. P. Pollak, Amit Baumel, Nehal P. Vadhan, Deborah Estrin, and Frederick Muench. 2021. The Digital Marshmallow Test (DMT) Diagnostic and Monitoring Mobile Health App for Impulsive Behavior: Development and Validation Study. JMIR mHealth and uHealth 9: 25018:1–25018:21. [Google Scholar] [CrossRef]
  43. Steele, Ric G., Jeffrey A. Hall, and Jennifer L. Christofferson. 2020. Conceptualizing Digital Stress in Adolescents and Young Adults: Toward the Development of an Empirically Based Model. Clinical Child and Family Psychology Review 23: 15–26. [Google Scholar] [CrossRef] [PubMed]
  44. Vaghefi, Isaac, and Bengisu Tulu. 2019. The Continued Use of Mobile Health Apps: Insights from a Longitudinal Study. JMIR mHealth and uHealth 7: 12983:1–12983:1. [Google Scholar] [CrossRef] [PubMed] [Green Version]
Table 1. Participant Characteristics.
Participant ID | Gender | Race and Ethnicity | Age | University | Year in School
P1 | Female | Asian or Asian American, White, Native Hawaiian or Other Pacific Islander | 20 | Private Four-Year | Year 3
P2 | Female | American Indian or Alaskan Native, Hispanic or Latino | 20 | Public Four-Year | Year 3
P3 | Female | Asian or Asian American | 20 | Public Four-Year | Year 3
P4 | Male | White | 18 | Private Four-Year | Year 1
P5 | Female | Asian or Asian American | 20 | Private Four-Year | Year 3
P6 | Male | White | 19 | Public Four-Year | Year 3
P7 | Male | White | 19 | Private Four-Year | Year 2
P8 | Male | Asian or Asian American, White | 19 | Private Four-Year | Year 2
P9 | Female | Black or African American | 18 | Private Four-Year | Year 1
P10 | Female | Black or African American | 21 | Private Four-Year | Year 4
P11 | Male | Black or African American | 20 | Private Four-Year | Year 2
P12 | Female | Black or African American | 20 | Private Four-Year | Year 3
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
