Published in Vol 6 (2023)

Web-Based Cognitive Behavioral Therapy for Depression Among Homebound Older Adults: Development and Usability Study


Original Paper

1School of Social Work, University of Michigan-Ann Arbor, Ann Arbor, MI, United States

2Department of Psychology, University of Michigan-Ann Arbor, Ann Arbor, MI, United States

3School of Social Work, Columbia University, New York City, NY, United States

4SLAC National Accelerator Laboratory, Stanford University, Menlo Park, CA, United States

5Factor-Inwentash Faculty of Social Work, University of Toronto, Toronto, ON, Canada

6Faculty of Information, University of Toronto, Toronto, ON, Canada

Corresponding Author:

Xiaoling Xiang, MSW, MPhil, PhD

School of Social Work

University of Michigan-Ann Arbor

1080 S University Avenue

Ann Arbor, MI, 48109

United States

Phone: 1 7347636581


Background: Homebound older adults are a high-risk group for depression. However, many of them face barriers to accessing evidence-supported mental health treatments. Digital mental health interventions can potentially improve treatment access, but few web-based interventions are explicitly tailored for depression in older adults.

Objective: This paper describes the development process of Empower@Home, a web-delivered intervention for depression in homebound older adults that is based on cognitive behavioral therapy, and reports on the outcomes of usability studies.

Methods: Empower@Home was developed in collaboration with community agencies, stakeholders, and older adults, guided by user-centered design principles. User needs were assessed through secondary data analysis, demographic and health profiles from administrative data, and interviews and surveys of community partners. A comparative usability evaluation was conducted with 10 older adults to assess the usability of Empower@Home compared to 2 similar programs. Field testing was conducted with 4 end users to detect additional usability issues.

Results: Feedback and recommendations from community partners heavily influenced the content and design of Empower@Home. The intervention consists of 9 sessions, including psychoeducation and an introduction to cognitive behavioral therapy skills and tools through short video clips, in-session exercises, an animated storyline, and weekly out-of-session home practice. A printed workbook accompanies the web-based lessons. In comparative usability testing (N=10), Empower@Home received a System Usability Scale score of 78 (SD 7.4), which was significantly higher than the 2 comparator programs (t9=3.28; P=.005 and t9=2.78; P=.011). Most participants, 80% (n=8), preferred Empower@Home over the comparators. In the longitudinal field test (n=4), all participants reported liking the program procedures and feeling confident in performing program-related tasks. The single-subject line graph showed an overall downward trend in their depression scores over time, offering an encouraging indication of the intervention’s potential effects.

Conclusions: Collaboration with community stakeholders and careful consideration of potential implementation issues during the design process can result in more usable, engaging, and effective digital mental health interventions.

JMIR Aging 2023;6:e47691



Being homebound is often linked with socioeconomic disadvantages, including low income, racial minority status, and high levels of disability. Studies have shown that half of homebound older adults exhibit clinically significant symptoms of depression, with 14% meeting the criteria for current major depression [1,2]. This starkly contrasts with the 2% prevalence of major depression in nonhomebound older adults [3]. When left untreated or insufficiently treated, depression can reduce quality of life and increase hospitalizations and early mortality [4]. Despite the availability of evidence-based treatments, traditional office-based services often remain out of reach for homebound older adults due to access barriers such as cost, transportation, and stigma [5]. Insurance coverage options for minor depression are also limited, and few mental health clinicians have received specialized training in working with older adults. It is crucial to find innovative ways to provide evidence-supported psychosocial treatments that are both accessible and cost-effective while reducing reliance on highly trained professionals and ensuring scalability.

Digital mental health interventions (DMHIs) are behavioral and psychological intervention strategies that use technology, such as websites, mobile apps, and other mobile devices, to improve mental health. Internet-based cognitive behavioral therapy, or iCBT, is one of the most studied DMHIs. iCBT is an automated psychotherapy based on cognitive behavioral therapy (CBT) principles delivered via dedicated websites or apps [6]. Patients receive psychoeducational materials through a web platform or a dedicated app and are exposed to the same core components as conventional CBT (eg, behavioral activation and cognitive restructuring). At a reduced marginal cost, iCBT can be used repeatedly with different patients without losing its therapeutic power, making it particularly useful for reducing health disparities in underresourced settings [7]. Studies have shown that iCBT is as effective as face-to-face CBT in treating depression in mixed-age samples [8]. Emerging evidence also supports the potential benefits of iCBT in older adults, including those with a heightened risk of depression [9,10].

However, most iCBT programs have not been specifically tailored to meet the needs of older adults, with only a few exceptions [11-14]. Tailoring for older adults includes procedural and content modifications to CBT that address differences in thinking styles and age-related adjustment [15]. In addition, the user interface (UI) in web-delivered interventions may need to be adjusted to fit the preferences, needs, and capabilities of older adults [16]. Furthermore, we are unaware of such programs in the US market tailored explicitly for homebound older adults. Generic DMHIs can benefit older adults, but we have found that those with complex interfaces often result in low adherence and engagement, limited effects, and a myriad of usability issues among low-income, homebound older adults [10,17]. These individuals are typically less tech-savvy and more sensitive to usability problems.

Our team developed Empower@Home, a web-based psychosocial depression intervention explicitly designed for homebound older adults, to address the shortage of DMHIs tailored to this high-need and underserved population. Empower@Home is a 9-session iCBT program that aims to prevent and reduce the symptoms of depression. The target population is homebound adults aged >60 years (ie, those with mobility difficulties). The intervention development process involved significant stakeholder input and user-centered design principles and occurred alongside academic-community partnership development. In this paper, we describe the process of developing Empower@Home, report on its feasibility and usability evaluation outcomes, and discuss its implications for designing DMHIs that are attuned to the needs of individuals and the characteristics of implementation settings.

The Empower@Home Intervention

Empower@Home includes 9 web-based lessons, each featuring didactic content, in-session exercises, motivational quotes, and an engaging animated story driven by human characters. Table 1 presents an overview of each session. Each lesson is presented in brief videos (less than 2 minutes) to lessen cognitive load (Multimedia Appendix 1). The lessons are arranged in a specific sequence, and each concludes with instructions for home practice. During home practices, users apply the skills they have learned using various program tools. These tools focus on fundamental CBT skills and are grouped into categories: doing tools for behavioral activation and problem-solving, thinking tools for cognitive restructuring, feeling tools for relaxation and mood monitoring, and communication tools for fostering effective communication. In addition, participants do a mood self-check by filling out the Patient Health Questionnaire-9 (PHQ-9) in every other session (sessions 1, 3, 5, 7, and 9) to track their symptoms [18].

Table 1. Empower@Home session-by-session overviewa.
Columns: Session | Session content | CBTb elements | Home practice

Session 1: Ready, Set… Go!
Session content: Session 1 orients the user to the program, delivers psychoeducation about depression and aging, and gently introduces CBT in jargon-free language. The session also includes content to motivate the user to engage with the program and introduces BAc, which is referred to as a doing tool.
CBT elements:
  • Psychoeducation
  • Doing tools: activity monitoring
Home practice: Activity monitoring form

Session 2: Doing Tools
Session content: Session 2 is a continued exploration of BA using the value-based BA approach. Major in-session activities include reviewing the activity monitoring form from the last session, charting the depression downward spiral, filling out the values and activities inventory, and creating the “my desired activities” master list. The first mindfulness exercise––Body Scan––is also introduced in this session.
CBT elements:
  • Doing tools: value-based behavioral activation; activity scheduling
  • Feeling tools: body scan
Home practice: Body scan and activity scheduling

Session 3: Working with Barriers
Session content: Session 3 continues the focus on BA skills by addressing common barriers to BA for older adults. Major in-session activities include practicing breaking things down into small steps, completing the “my desired activities” master list by adding the names of supportive people, and turning “you statements” into “I statements.” This session also discusses the characteristics of effective communication.
CBT elements:
  • Doing tools: break down tasks; activity scheduling
  • Communication tools: effective communication and I statements
Home practice: Activity scheduling

Session 4: Keep Doing
Session content: Session 4 continues to address common barriers to BA, including unhelpful thoughts (eg, “I can’t do anything”) and physical barriers to doing things. The user learns about adaptive behaviors and behavioral modification methods. Issues like independence and getting help are also discussed. Major in-session exercises include adaptive behavior quizzes, identifying inner strengths, and making adjustments to the “my desired activities” master list. The second mindfulness exercise, called mind-calming, is introduced.
CBT elements:
  • Doing tools: adaptive behaviors
  • Thinking tools: unhelpful thoughts related to BA
  • Feeling tools: mind-calming exercise
Home practice: Mind-calming exercise and activity scheduling

Session 5: Problem Solving
Session content: Session 5 provides a 5-step problem-solving technique. The user follows along to practice the technique using their own problem, concluding in an action plan. The second communication tool––active listening––is introduced.
CBT elements:
  • Doing tools: 5-step problem-solving
  • Communication tools: active listening
Home practice: Problem-solving and activity scheduling

Session 6: Unhelpful Thinking
Session content: Session 6 is the first of two sessions on cognitive restructuring—another core CBT skill. The user learns about common unhelpful thinking patterns and is asked to identify them in case stories and reflect on their experience. Core beliefs are also introduced.
CBT elements:
  • Thinking tools: ABC model; identify unhelpful thoughts and core beliefs
Home practice: Thought record and activity scheduling

Session 7: Thinking Tools
Session content: Session 7 is the second session on cognitive restructuring and moves from identifying thinking errors to challenging them. The “7-column thought record” is introduced to continue tracking thinking errors and practicing challenging methods.
CBT elements:
  • Thinking tools: challenge unhelpful thoughts
Home practice: Thought record and activity scheduling

Session 8: Feeling Tools
Session content: Session 8 discusses various forms of self-care and addresses physical activity and nutrition. The second half of the session introduces mindfulness and walks the users through a guided breathing exercise. Another mindfulness exercise––the senses exercise––is also introduced.
CBT elements:
  • Feeling tools: self-care; mindfulness
Home practice: Breathing exercise and activity scheduling

Session 9: Putting It All Together
Session content: As the last session of the program, Session 9 reviews the core techniques taught and addresses relapse prevention. The user follows along to create their empowerment guide. The user also learns about other treatment options like medication, one-on-one therapy, and other therapies.
CBT elements:
  • Program review and relapse prevention
Home practice: Relapse prevention plan

aThe list of sessions and content presented in this table is the most updated version and is being tested in an ongoing pilot randomized controlled trial.

bCBT: cognitive behavioral therapy.

cBA: behavioral activation.

The web-based sessions are enriched with an animated case story series featuring a 74-year-old homebound woman named Jackie (Multimedia Appendix 2). The animated story series is embedded within each session, similar to a television show episode, to reinforce and further illustrate the application of core CBT skills and techniques. The inclusion of an animated case story aligns with persuasive design and uses entertainment to elicit strong emotional responses [19].

The web-based program is accompanied by a printed user workbook in large print, containing session summaries, in-session exercises, directions and forms for home practices, inspirational quotes, and wellness resources (Multimedia Appendix 3). Typically, iCBT programs offer web-based worksheets or workbooks. However, we provided a printed workbook considering the target population’s likely familiarity with print media and the commonly reported issues regarding text entry from other iCBT studies involving older adults [20].

Using agile, state-of-the-art development processes, we built the web platform as a custom learning management system and made it accessible across various devices. The main interface features large buttons, icons with text descriptions, high-contrast color schemes, and intuitive navigation, all of which follow the current best practices for creating age-friendly UIs [16]. In addition, we designed a provider dashboard that allows providers to review patients’ progress, enabling them to readily access easily digestible data for quality improvement and evaluation purposes (Multimedia Appendix 4).

Design Overview

The intervention development followed a user-centered design process and involved 3 iterative steps: elicitation, design, and usability testing, as outlined by Kruzan et al [21]. In the elicitation phase, we drew on multiple data sources to inform our understanding of end user needs, preferences, and requirements.

In the design phase, we used a co-design approach to create the treatment manual, case stories, and media design in collaboration with community stakeholders. These stakeholders included older adult advisors, geriatric mental health professionals, and aging services providers from 14 community organizations in Michigan. Across more than 100 meetings, the team iteratively refined the web-based session scripts, voice-over casting, animated character designs, media designs, workbook designs, and other program elements based on stakeholder input. The core research team worked with a user experience designer to develop a wireframe, followed by a low-fidelity prototype. This prototype was tested by researchers and older adult stakeholders and refined based on their feedback.

In the usability testing phase, the ready-to-release version of Empower@Home underwent a heuristic evaluation by researchers and user experience designers, followed by an in-home comparative usability study involving think-aloud exercises and longitudinal field testing with end users. Refinements to both the treatment manual and the web interfaces were made after each evaluation.

We assembled a multidisciplinary team to support our design activities, including mental health researchers, gerontologists, human-computer interaction researchers, user experience designers, web developers, and community stakeholders. We used recommended eHealth development strategies, such as heavy stakeholder participation, an iterative design process, continuous evaluations, and integration of implementation issues and concerns into the design process [22].


Multiple data sources informed our understanding of the needs, preferences, and requirements of end users and the community settings that are likely to implement the intervention. First, we reanalyzed qualitative data from 21 homebound older adults who had participated in our prior study on a generic iCBT program. We applied a deductive coding approach guided by the efficiency model of support [23] to identify issues surrounding usability (ease of use), engagement (motivation), fit (meeting user’s needs), knowledge (how to use a tool), and implementation (how to apply the tool into user’s life). The procedures of the prior study are detailed elsewhere [17]. We edited the transcripts, preserving only text relevant to our research question. A priori codes were developed based on the efficiency model of support and codes from our prior work. New codes were inductively added as we analyzed the transcripts. After the codes were finalized, we then identified patterns and categorized the initial codes into a smaller number of groups, themes, and concepts.

Our second data source was summary statistics of participants in Michigan’s 1915(c) MI Choice Waiver program, generously shared by colleagues. This program caters to our target demographic of low-income homebound older adults.

In addition, we established partnerships with social service agencies that serve many homebound older adults. We conducted semistructured interviews and a web-based survey with these organizations to identify potential barriers to implementing DMHIs at both the provider and organization levels. Each organization was profiled, and common barriers to implementation were identified through a descriptive analysis of the survey data and a text analysis of the qualitative data.


The design process primarily used co-design meetings and passive storyboarding techniques. Given that CBT is considered the gold-standard psychotherapy for depression, the core session elements were informed by widely used, evidence-supported CBT manuals [24-27]. In our regular meetings with community stakeholders, we shared each session’s content by reading it aloud, thus soliciting immediate feedback and stimulating discussions on potential improvements to both content and delivery. Revisions, guided by meeting notes, were typically integrated within a day or two. Frequent meetings were held with stakeholders from various organizations each week. The revised content would then be tested in subsequent meetings with different stakeholder groups, an iterative process that continued until the core development team was satisfied that all feedback had been addressed. This labor-intensive co-design process served to not only craft the intervention but also foster partnerships with community organizations. Given that these activities unfolded amidst the COVID-19 pandemic, individual co-design meetings over Zoom (Zoom Video Communications, Inc) were deemed the most viable way to garner feedback from each stakeholder group.

While social service providers, many of whom are social workers, significantly influenced the psychoeducational content of the program, the development of the character-driven animated story heavily relied on input from homebound older adults. These senior advisors contributed to weekly small group discussions, guiding the development of characters, plot, script, and visual design. Each story episode was role-played during these meetings to obtain feedback on tone, dialogue, and alignment with the educational objectives of the sessions. Following each meeting, the core development team convened for a debrief, and revisions were promptly implemented. This iterative process of presenting revised scripts and design elements continued until no further feedback was forthcoming.

Usability Testing

In-Home Comparative Usability Evaluation

In a single in-home session, we conducted a comparative usability evaluation of 3 different DMHI programs, including Beating the Blues, MoodGym, and Empower@Home. Ten homebound older adults were recruited through UMHealthResearch, a volunteer registry maintained by the University of Michigan. Participants were eligible if they were at least 60 years old and homebound. Homebound status was broadly defined as self-reported difficulty with outdoor mobility or receipt of in-home care or home-delivered meals. Prior computer experience was not required. A diagnosis of depression or elevated depressive symptoms was not required.

Beating the Blues was selected due to a solid body of evidence supporting its effectiveness [28,29] and core features shared with Empower@Home, including using a mix of audio and video-based content to teach CBT techniques, case examples, in-session exercises, and homework practice. Unlike Empower@Home, Beating the Blues uses real actors in its case examples. MoodGym, which has no audio or video material, was also included in the evaluation. MoodGym is a popular iCBT program and has been shown to reduce depressive symptoms [30]. Of the 3 programs, Empower@Home was the only program specifically designed for older adults.

The order of trials was based on a predetermined random sequence to reduce the influence of learning effects. Participants spent up to 20 minutes per program and were asked to think aloud as they performed tasks such as reviewing information on the web pages, advancing to the next page, navigating between program components, and completing in-session exercises. After each trial, participants completed the System Usability Scale (SUS), followed by open-ended questions to probe their experiences, likes, and dislikes. After participants tried all 3 programs, they were asked to select their favorite program and explain their choice.

The SUS is a 10-item scale commonly used in evaluating the usability of websites, software, and other human-machine systems [31]. Scoring the SUS involves reverse coding the negatively worded statements and summing up all 10 item scores. The sum was then multiplied by 2.5 so that the total SUS score ranged from 0 (very poor perceived usability) to 100 (excellent perceived usability) in 2.5-point increments. A SUS score above 68 is considered above average. The SUS is a valid measure to compare systems [32] and had excellent internal consistency in the study sample (Cronbach α=.92).
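The SUS scoring rule described above can be sketched as a short function. This is an illustrative helper, not the study's analysis code; the function name and example responses are assumptions for demonstration.

```python
def sus_score(responses):
    """Score a System Usability Scale questionnaire.

    `responses` is a list of ten Likert ratings (1-5) in item order.
    Odd-numbered items (1-indexed) are positively worded and contribute
    (rating - 1); even-numbered items are negatively worded and are
    reverse-coded, contributing (5 - rating). The resulting 0-40 sum is
    multiplied by 2.5 to yield a 0-100 score in 2.5-point increments.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5

# Example: uniformly favorable answers (4 on positive items, 2 on negative)
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # 75.0, above the 68 average
```

Note that because each item contributes in 1.25-point steps after scaling, two systems can differ by as little as 2.5 points on the 0-100 scale.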

Data were collected at each participant’s home, and all participants engaged with the 3 programs on a 10.5-inch tablet provided by the study team. Participants’ interactions with the screen were recorded using a screen recording app. One researcher with user experience design training took detailed notes of participants’ interactions with each program.

Descriptive statistics were conducted to describe the study sample and the SUS scores. Paired sample 2-tailed t tests were used to compare the SUS scores of Empower@Home and those of the 2 comparator programs. The user experience designer coded the responses to open-ended questions, field notes, and observations and generated a 1-page report, which aided in the interpretation of the SUS scores and their differences.
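The paired-sample t test used here reduces to a one-sample t statistic on per-participant score differences. The sketch below (standard library only) illustrates that computation; the sample scores are hypothetical, not the study's data.

```python
import math
from statistics import mean, stdev

def paired_t(scores_a, scores_b):
    """Paired-sample t statistic and degrees of freedom (n - 1).

    Each participant contributes one difference d_i = a_i - b_i
    (e.g., their SUS score for one program minus their score for a
    comparator); t = mean(d) / (stdev(d) / sqrt(n)).
    """
    diffs = [a - b for a, b in zip(scores_a, scores_b)]
    n = len(diffs)
    t = mean(diffs) / (stdev(diffs) / math.sqrt(n))
    return t, n - 1

# Hypothetical SUS scores for 5 participants on two programs
t, df = paired_t([80, 75, 85, 70, 78], [72, 70, 76, 68, 71])
```

The statistic is then compared against the t distribution with n - 1 degrees of freedom to obtain the 2-tailed P value.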

Longitudinal Field Testing

We conducted longitudinal field testing with low-income homebound older adults through a small open-pilot trial. Participants in the field test were recruited via community partner agency referrals. Participants needed to (1) read and speak English, (2) be at least 60 years old, and (3) have at least mild depressive symptoms at screening (≥5 on the PHQ-9) [18]. Individuals were ineligible if they had (1) probable dementia based on the Blessed Orientation, Memory, and Concentration test (score>9) [33]; (2) elevated suicide risk based on the Columbia-Suicide Severity Rating Scale [34]; (3) a terminal illness or unstable physical health conditions; or (4) severe vision impairment. Device ownership, prior computer use, or internet access were not required.

A 10.5-inch tablet with cellular data was provided to participants without technology access. Participants were given 10 weeks to complete the program with minimal support from project staff in the form of a brief weekly check-in that typically lasted between 5 and 10 minutes. Participants were invited to complete a short survey before the start of the program and then again at the end of the 10-week trial. Participants also completed up to 5 in-app assessments based on the PHQ-9.

Descriptive statistics were conducted to describe the study sample. Given the small sample size, inferential statistics were not computed. Instead, a single-subject line graph was used to visualize the changes in PHQ-9 scores over time. Furthermore, participants’ feedback and field notes were coded and consolidated to uncover any additional usability concerns.
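When inferential statistics are not computed, a per-participant trend can still be summarized descriptively, for example, by the least-squares slope of PHQ-9 scores against assessment number. The function below is a minimal sketch; the sample scores are illustrative assumptions, not study data.

```python
def trend_slope(scores):
    """Ordinary least-squares slope of scores against assessment index 0..n-1.

    A negative slope indicates an overall downward (improving) trend
    across repeated PHQ-9 measurements.
    """
    n = len(scores)
    x_bar = (n - 1) / 2                       # mean of indices 0, 1, ..., n-1
    y_bar = sum(scores) / n
    num = sum((x - x_bar) * (y - y_bar) for x, y in enumerate(scores))
    den = sum((x - x_bar) ** 2 for x in range(n))
    return num / den

# Hypothetical PHQ-9 scores from the 5 in-app assessments (sessions 1, 3, 5, 7, 9)
print(round(trend_slope([14, 12, 11, 9, 8]), 2))  # -1.5 points per assessment
```

A slope summary complements, rather than replaces, visual inspection of the single-subject line graph.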

Ethics Approval

The University of Michigan Institutional Review Board approved the usability study and the field-testing study (HUM00207612). Written informed consent was obtained at the start of the home visit for the usability study. Verbal informed consent was obtained from each participant before the start of the program for the field-testing study.


Secondary data analysis of qualitative data from our previous project revealed various user needs and potential failure points to address. The key problems identified were usability issues that frustrated older participants. These included hard-to-read text, small clickable areas, perplexing navigation pathways, complex menu options, excessive information, and difficulties with text entry. One feature of the DMHI evaluated in our previous study required users to input text, which proved challenging for most participants. The contributing factors included a small text entry field, minimized text size when entered, unfamiliarity with on-screen keyboards, lower literacy levels, and potential website bugs, such as misleading error messages indicating omitted entries when users had filled them in. Underscoring the importance of design that accounts for age, the same program was highly regarded as “straightforward” and “easy-to-use” when tested by research assistants, most of whom were in their 20s.

In terms of engagement, participants appreciated the characters and their stories, the digestible module format, and the in-session exercises throughout the sessions. However, some activities, especially those requiring considerable cognitive flexibility, posed a challenge for them. Regarding issues surrounding fit, some participants noted that it occasionally used complex or sophisticated language that was difficult to comprehend. A primary concern was the program’s lack of age-appropriate stories and case examples, leading to a perception that it was “not for someone like me.” Another recurrent complaint was the excessive length of some sessions. Despite being broken down into shorter segments or pages, these sessions sometimes required 2 to 3 hours or more, imposing a considerable burden on the users. In terms of knowledge, external support could enhance comprehension of session content. Finally, concerning implementation, participants welcomed the opportunity to apply the skills learned beyond the web-based sessions.

Based on the summary statistics of participants in Michigan’s 1915(c) MI Choice Waiver program, the typical profile of a low-income, homebound older adult is as follows: female (68%), White (75%), living alone (34%), aged between 65 and 79 years (36%), and experiencing diabetes (39%) and pain-related issues (44%).

Additional insights for the design were gleaned from semistructured interviews (n=14) and the web-based survey (n=17) conducted with social service agencies. At the client level, the primary barriers to DMHI implementation identified by social service providers were limited access to technology (n=17, 100%), low technology literacy (n=16, 94%), the stigma associated with mental illness (n=12, 71%), and cost constraints (n=10, 59%). Provider-level barriers included limited knowledge of geriatric depression, high caseloads, and competing demands. At the organizational level, potential barriers included a lack of financial incentives, reimbursement restrictions, and staff shortages.


The design of Empower@Home, based on insights from our elicitation phase, addressed the identified failure points with various features. We created a streamlined UI to enhance usability, incorporating intuitive navigation, clear call-to-action prompts (eg, “Press NEXT”), and large buttons, text, and print. Our design avoids complex menu options and information overload on any page, focusing on a responsive web layout where each page fits within a single screen of a 10-inch tablet or larger, thus eliminating the need for scrolling. Most exercises are implemented via a printed workbook, making on-screen text entry optional. The sessions are brief (20-25 min) and divided into short videos and occasional voice-over instructional pages. A video tutorial to familiarize users with the system and on-demand technical support are also available.

For engagement, our program uses video-based learning featuring diverse older adults and an overarching character-driven narrative featuring a homebound older adult named Jackie. The inclusion of in-session exercises throughout the sessions aims to maintain user involvement.

Regarding fit, we used plain language, age-appropriate case stories, and examples. Case stories and additional workbook information support knowledge acquisition. Finally, weekly home practices and modeling behavioral changes in the character-driven story are designed to aid the implementation of the CBT tools.

To further enhance our program, we introduced “Empower Coaches,” laypersons trained to provide weekly support calls to users, thereby addressing potential technical difficulties. This addition stemmed from user preferences for real-time support from a human over a fully automated system. Such external support is vital for populations with lower educational attainment and health or technology literacy, as it can bolster knowledge and implementation. While clinicians or therapists could fulfill the coaching role, we opted for laypersons or agency staff without specialized mental health training, such as caseworkers or community health workers. This decision considered the shortage of mental health professionals and the staffing structure of social service agencies serving older adults identified through our elicitation phase.

Stakeholder input influenced every aspect of the program design. To illustrate, input from older adult advisors informed the selection of the narrator’s voice, with a preference for lower-pitched voices with neutral American accents and slightly slower pacing. We also avoided using background music during voice-over narrations to prevent comprehension difficulties for those with age-related hearing loss [35]. Additionally, 1 group of social service providers identified the lack of diversity among inspirational quotes, leading to a more diverse selection in our program.

Jackie, the central character in the animated story series, was modeled on the typical profile of Michigan’s MI Choice Waiver participants. Jackie is portrayed as a 74-year-old White female living by herself, similar to the typical participant profile. She also shares their health challenges, specifically diabetes and arthritis-related pain. The decision to animate the Jackie story was informed by small group discussions, in which older adult advisors unanimously preferred animated story series over those performed by actors. Animation also provided an opportunity to incorporate visual storytelling elements that deepened the Jackie narrative without overrelying on lengthy narration or dialogue. When the visual design of characters was presented to stakeholders for feedback, a strong preference was shown for designs that did not rely on stereotyped representations of older people as disabled or frail. The initial character designs were revised based on additional stakeholder feedback. Multimedia Appendix 2 shows example video frames from various episodes of the Jackie story.

Development Cost

Excluding research staff time, replicators can expect a platform development cost of approximately US $20,000 and monthly maintenance costs of around US $100. Our initial intervention development cost was US $10,125, supplemented by US $8700 for iterative refinements, for a total of US $18,825. Regular upkeep, including server hosting with 2 central processing units, 4 GB RAM, 50 GB storage, and automated backups, costs US $37 per month, plus US $50 per month for a dedicated database compliant with the Health Insurance Portability and Accountability Act. The University of Michigan Information and Technology Services provides the hosting service.

Additional expenses to consider are content creation costs, which can significantly differ based on creative requirements and chosen vendors. In our case, the animated story series “Jackie” cost US $32,900 to produce. The storyline, crafted by a freelance writer with a master of social work degree, incurred a cost of US $3000. In addition, voice-over recordings, performed by Fiverr-sourced artists who took on the roles of narrator, mindfulness exercise guide, and characters from the animated story series, added US $5000 to our expenses, making the total cost for the animated story series US $40,900.

Usability Testing

In-Home Comparative Usability Evaluation

The in-home visits lasted between 90 and 120 minutes. Table 2 shows descriptive statistics of the 10 participants from the in-home usability evaluation. They were aged 71.4 years on average, and primarily identified as female (n=6, 60%). In total, 8 had at least a college degree (80%), and 5 (50%) had a household annual income of over US $50,000. They all owned a laptop or a computer and had internet access at home. All agreed or strongly agreed that they felt confident working on computers.

Table 2. Descriptive statistics of the in-home comparative usability study participants (N=10).
Age (years), mean (SD): 71.4 (6.4)
Gender, n (%)
    Male: 4 (40)
    Female: 6 (60)
Race or ethnicity, n (%)
    Non-Hispanic White: 9 (90)
    Hispanic: 1 (10)
Education, n (%)
    Some college, no degree: 2 (20)
    Bachelor’s degree: 3 (30)
    Graduate degree: 5 (50)
Marital status, n (%)
    Married or partnered: 3 (30)
    Divorced or separated: 3 (30)
    Widowed: 3 (30)
    Never married: 1 (10)
Household income (US $), n (%)
    $10,000-$20,000: 1 (10)
    $20,001-$30,000: 1 (10)
    $30,001-$40,000: 1 (10)
    $40,001-$50,000: 2 (20)
    $50,001+: 5 (50)
Regularly used devices, n (%)
    Tablet or iPad: 8 (80)
    Laptop or computer: 10 (100)
    Smartphone: 9 (90)

Table 3 shows the SUS scores by program tested. Normality tests were conducted on the SUS scores using the Stata commands swilk (the Shapiro-Wilk test for normality) and sktest (a test for skewness and kurtosis). All tests yielded P values exceeding .05, failing to reject the null hypothesis and suggesting that the SUS scores were normally distributed. The mean SUS score was 78.0 for Empower@Home, 55.8 for Beating the Blues, and 57.5 for MoodGym. SUS scores for Empower@Home had the smallest range and SD (mean 78.0, SD 7.4), suggesting consistent perceived usability across participants. In contrast, SUS scores for Beating the Blues (mean 55.8, SD 24.4) and MoodGym (mean 57.5, SD 20.1) had large ranges and SDs. Paired 2-tailed t tests showed significantly higher SUS scores for Empower@Home than for Beating the Blues and MoodGym, suggesting the superior perceived usability of Empower@Home over the 2 comparable programs.
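For context, each SUS total is computed from ten 1-5 Likert items: odd-numbered (positively worded) items contribute the response minus 1, even-numbered (negatively worded) items contribute 5 minus the response, and the sum is multiplied by 2.5 to yield a 0-100 score. A minimal sketch (the function name is ours, for illustration only):

```python
def sus_score(responses):
    """Compute the 0-100 System Usability Scale score from ten 1-5 item
    responses. Odd-numbered items are positively worded; even-numbered
    items are negatively worded and reverse-scored."""
    assert len(responses) == 10
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # 0-indexed: items 1,3,5,... vs 2,4,6,...
        for i, r in enumerate(responses)
    )
    return total * 2.5

# A respondent answering 4 to every item: (4-1)*5 + (5-4)*5 = 20, times 2.5
print(sus_score([4] * 10))  # → 50.0
```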

Table 3. Usability statistics by programs tested during in-home evaluation.
Program tested | System Usability Scalea score, mean (SD) | Range (min-max) | Paired 2-tailed t testb (df) | P(T>t) value
Empower@Home | 78.0 (7.4) | 65-87.5 | Comparator | Comparator
Beating the Blues | 55.8 (24.4) | 2.5-87.5 | t=3.28 (9) | .005
MoodGym | 57.5 (20.1) | 22.5-90 | t=2.78 (9) | .011

aHigher total score indicates better usability.

bPaired 2-tailed t test compared the total SUS scores between Beating the Blues and Empower@Home, and between MoodGym and Empower@Home. Applying the Bonferroni correction, a 1-tailed P value of <.025 is statistically significant.
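The analysis above (a Shapiro-Wilk normality check followed by paired t tests with a Bonferroni-corrected threshold of .025) can be reproduced with standard tools. A hedged sketch in Python using scipy; the score vectors below are hypothetical placeholders, as per-participant SUS scores were not published:

```python
# Sketch of the reported procedure, NOT a reproduction of the results:
# the individual SUS scores below are hypothetical placeholders.
from scipy import stats

empower = [65.0, 70.0, 72.5, 75.0, 77.5, 80.0, 80.0, 82.5, 85.0, 87.5]
beating = [2.5, 30.0, 40.0, 50.0, 55.0, 60.0, 65.0, 70.0, 77.5, 87.5]

# Normality check (Stata: swilk); P > .05 fails to reject normality.
_, p_normal = stats.shapiro(empower)

# Paired t test (Stata: ttest, paired); with 2 planned comparisons,
# the Bonferroni-corrected significance threshold is .05 / 2 = .025.
t_stat, p_value = stats.ttest_rel(empower, beating)
print(f"t={t_stat:.2f} (df=9), P={p_value:.3f}, significant={p_value < 0.025}")
```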

Of the 10 participants, 8 (80%) preferred Empower@Home, reporting that they liked its mix of audio, video, and visual materials and found it easy to use and engaging. Half of the users liked the narration and the animated story and said the story felt “real.” Most felt that the look and feel of Empower@Home was neutral, with a clear layout. Two participants preferred Beating the Blues, citing its pacing, use of real actors, case examples, and in-session exercises. However, most participants, including the 2 who preferred Beating the Blues, found its homepage difficult to navigate and criticized its densely presented information, long loading times, and distracting “Urgent Support” button. Most users preferred a mix of audio, video, and visual materials, which both Empower@Home and Beating the Blues provided. Most participants did not favor MoodGym, finding it text-heavy with poor readability (small font and occasionally confusing terms or jargon). One participant who loved reading shared a positive impression of MoodGym, although this participant still chose Empower@Home as their favorite.

Usability problems were found, particularly with the touch registration of the “Back” and “Next” buttons. Because the buttons were created with HTML’s <div> tag, they behaved like hyperlinks and registered a tap only when it landed close to the button text. The issue is accentuated for older adults unfamiliar with touchscreens, who often apply long, hard presses that capacitive touchscreens may not recognize. To address this issue, we first replaced the <div> with the <button> tag to create an actual button whose entire area is clickable. Second, we implemented a dual color scheme to signal when a click command is registered. Third, we provided tips on interacting with a touchscreen in a short navigation tutorial played at the beginning of the program. Finally, we offered participants a stylus pen to reduce problems caused by dry fingertips.
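The markup change described above can be illustrated with a minimal sketch; the class names, styles, and nextPage handler are illustrative placeholders, not the program’s actual code:

```html
<!-- Before: a <div> styled as a button behaves like a hyperlink,
     so only taps landing near the text register. -->
<div class="nav-link" onclick="nextPage()">Next</div>

<!-- After: a real <button> makes the whole element the touch target,
     and the :active style signals that a tap has registered. -->
<button type="button" class="nav-btn" onclick="nextPage()">Next</button>

<style>
  .nav-btn { padding: 16px 32px; font-size: 1.25rem; }  /* large touch target */
  .nav-btn:active { background: #2b6a4d; color: #fff; } /* visual tap feedback */
</style>
```

Native <button> elements also receive keyboard focus and activation for free, which <div>-based controls do not.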

Longitudinal Field Testing

Four participants provided posttest data for the longitudinal field testing. They were all low-income homebound older adults enrolled in the Medicaid MI Choice Waiver program. Table 4 shows descriptive statistics of the participants. None of the 4 participants had a 4-year college degree. Two used the program on their own devices, and the other 2 used a 10.5-inch tablet provided by the study team. All participants had elevated depressive symptoms on the PHQ-9 before the start of the program (mean 12.75, SD 3.6).

Table 4. Descriptive statistics of the longitudinal field testing (n=4).

Age (years), mean (SD): 64.3 (3.4)
Gender, n (%)
    Male: 1 (25)
    Female: 3 (75)
Race or ethnicity, n (%)
    Non-Hispanic White: 4 (100)
    Hispanic: 0 (0)
Education, n (%)
    High school: 2 (50)
    Some college, no degree: 2 (50)
Marital status, n (%)
    Married or partnered: 2 (50)
    Divorced or separated: 1 (25)
    Never married: 1 (25)
Household income (US $), n (%)
    $10,000-$20,000: 2 (50)
    $20,001-$30,000: 2 (50)
Device ownership, n (%)
    Tablet, iPad, laptop, or computer: 2 (50)
    No device ownership: 2 (50)
Pretreatment PHQ-9a score, mean (SD): 12.75 (3.6)

aPHQ-9: Patient Health Questionnaire-9.

At the end of the 10 weeks, 3 participants completed all 9 sessions, and 1 completed 8 sessions, suggesting excellent adherence rates. All participants agreed or strongly agreed that they liked the procedures used in this program and felt confident in their ability to perform the tasks required to participate in this program. The single-subject line graph (Figure 1) shows an overall trend of decreasing PHQ-9 scores over time.

Figure 1. Single-subject line graph showing Patient Health Questionnaire-9 scores from in-app assessments.

Discussion

Principal Findings

Through a collaborative design process involving various stakeholders, we developed a DMHI incorporating CBT principles, age-related themes, engaging content, and an accessible UI. The in-home comparative usability evaluation results suggested that Empower@Home had higher perceived usability than Beating the Blues and MoodGym, 2 established iCBT programs. Most participants preferred Empower@Home over the other programs, citing its engaging multimedia content, clear layout, and relatable animated story. The longitudinal field testing results showed that low-income homebound older adults could adhere to the program with minimal support, suggesting the potential feasibility of the intervention.

Although the benefits of involving stakeholders in designing eHealth interventions, such as enhanced acceptability and engagement, are well discussed and acknowledged [36], members of some social groups continue to be excluded from full participation in the digital health ecosystem [37]. One such group is homebound older adults, who experience multiple social vulnerabilities and have limited technology literacy. Working with older adults with varying needs and technology literacy levels, we identified and addressed potential usability issues, such as touchscreen navigation difficulties, by refining the design and providing additional support and guidance to users. The program’s character-driven story was also developed in close collaboration with older adult advisors, drawing on the profile of low-income homebound older adults, to ensure that the central character, Jackie, is representative of the population likely to receive the program as part of routine practice.

The COVID-19 pandemic has accelerated the use of web-based applications across multiple areas of health care, including mental health services [38]. The trend toward using DMHIs as part of routine care for those seeking treatment for mental health concerns is expected to continue. DMHIs that are cost-effective, scalable, ecologically responsive, and tailored have the potential to significantly expand treatment access, improve treatment outcomes, and support equity in mental health care. Researchers and clinicians developing DMHIs can learn from our experiences, which included close collaboration with community agencies and care recipients, an iterative design process, and close attention to user experience.


Limitations

Although our results are promising, there are some limitations to consider. First, participants in the in-home usability evaluation were predominantly non-Hispanic White and college educated, which may not represent those likely to receive Empower@Home as part of routine practice; as a result, we may have overlooked UI challenges faced by other groups. Additionally, the in-home comparative usability evaluation involved a single session with each program, which may not fully capture the user experience throughout the intervention. Furthermore, although our longitudinal field testing was conducted with chronically ill, low-income homebound older adults, the sample was small and lacked diversity. As development proceeds, we will continue to integrate feedback from participants from more diverse backgrounds and determine the efficacy of the intervention [39,40]. Although we proactively considered implementation challenges during the design phase, future studies should systematically investigate implementation; issues such as coach training and fidelity require further exploration to ensure the intervention is delivered as intended.


Conclusions

The development of Empower@Home provides a valuable example of how DMHIs can be designed and developed through close collaboration with stakeholders, iterative design processes, and attention to user experience. DMHIs have the potential to significantly expand access to mental health care and improve treatment outcomes, and future research should continue to explore the efficacy and implementation of these interventions.


Acknowledgments

The Michigan Health Endowment Fund funded the development project. Many individuals and entities not listed as coauthors contributed to the design and development of Empower@Home. The following user experience designers contributed to web and video design: Jing Xie, Crystal Huang, and Cindy Thai. Yu Wang designed the wireframe of the application. Ashima Kaura designed the user workbook. Wenzhao Zhang designed the program logo and favicon. Devika Joglekar created the animated videos of Jackie’s story. The following computer science students contributed to web improvement: Ethan Yeager, Duy Nguyen, and Sangil Lee. We thank the more than 30 social service providers from community agencies for their time and input, including A&D Home Health Care, Inc; Area Agency on Aging of Northwest Michigan; Northeast Michigan Community Service Agency, Inc; Northern Lakes Community Mental Health Authority or Northern Health Care Management; The Senior Alliance, Inc; The Information Center; Reliance Community Care Partners; Region 2 Area Agency on Aging; Region IV Area Agency on Aging; Region VII Area Agency on Aging; Senior Resources; Senior Services of Southwest Michigan; Upper Peninsula Commission for Area Progress; and Valley Area Agency on Aging. In addition, we thank Elizabeth Gallagher and her team at the Home and Community-Based Services Section at the Michigan Department of Health and Human Services for attending partnership meetings and advising on reimbursement and strategies to implement and continue the program. Special thanks go to the older adult advisors and other community stakeholders who participated in the co-design process and the volunteers in the usability testing. Finally, Julie Bynum, MD, MPH, and her team provided the summary statistics of MI Choice Waiver participants.

Conflicts of Interest

None declared.

Multimedia Appendix 1

Empower@Home session content example web pages. From left to right: video, text with voice-over, and mood self-check score page.

DOCX File , 502 KB

Multimedia Appendix 2

Example video frames from the animated story of Jackie.

DOCX File , 1371 KB

Multimedia Appendix 3

Empower@Home user workbook example pages. From left to right: session summary, in-session exercise, home practice, and inspirational quote.

DOCX File , 1819 KB

Multimedia Appendix 4

Empower@Home user interface example pages. From left to right: program homepage, video page, and provider dashboard.

DOCX File , 371 KB

  1. Xiang X, Leggett A, Himle JA, Kales HC. Major depression and subthreshold depression among older adults receiving home care. Am J Geriatr Psychiatry. 2018;26(9):939-949. [FREE Full text] [CrossRef] [Medline]
  2. Bruce ML, McAvay GJ, Raue PJ, Brown EL, Meyers BS, Keohane DJ, et al. Major depression in elderly home health care patients. Am J Psychiatry. 2002;159(8):1367-1374. [FREE Full text] [CrossRef] [Medline]
  3. Woody CA, Ferrari AJ, Siskind DJ, Whiteford HA, Harris MG. A systematic review and meta-regression of the prevalence and incidence of perinatal depression. J Affect Disord. 2017;219:86-92. [FREE Full text] [CrossRef] [Medline]
  4. Schulz R, Beach SR, Ives DG, Martire LM, Ariyo AA, Kop WJ. Association between depression and mortality in older adults: the cardiovascular health study. Arch Intern Med. 2000;160(12):1761-1768. [FREE Full text] [CrossRef] [Medline]
  5. Brenes GA, Danhauer SC, Lyles MF, Hogan PE, Miller ME. Barriers to mental health treatment in rural older adults. Am J Geriatr Psychiatry. 2015;23(11):1172-1178. [FREE Full text] [CrossRef] [Medline]
  6. Andersson G, Cuijpers P, Carlbring P, Riper H, Hedman E. Guided internet-based vs. face-to-face cognitive behavior therapy for psychiatric and somatic disorders: a systematic review and meta-analysis. World Psychiatry. 2014;13(3):288-295. [FREE Full text] [CrossRef] [Medline]
  7. Muñoz RF. Using evidence-based internet interventions to reduce health disparities worldwide. J Med Internet Res. 2010;12(5):e60. [FREE Full text] [CrossRef] [Medline]
  8. Luo C, Sanger N, Singhal N, Pattrick K, Shams I, Shahid H, et al. A comparison of electronically-delivered and face to face cognitive behavioural therapies in depressive disorders: a systematic review and meta-analysis. EClinicalMedicine. 2020;24:100442. [FREE Full text] [CrossRef] [Medline]
  9. Read J, Sharpe L, Burton AL, Arean PA, Raue PJ, McDonald S, et al. A randomized controlled trial of internet-delivered cognitive behaviour therapy to prevent the development of depressive disorders in older adults with multimorbidity. J Affect Disord. 2020;264:464-473. [FREE Full text] [CrossRef] [Medline]
  10. Xiang X, Sun Y, Smith S, Lai PHL, Himle J. Internet-based cognitive behavioral therapy for depression: a feasibility study for home care older adults. Res Soc Work Pract. 2020;30(7):791-801. [CrossRef]
  11. Dear BF, Zou J, Titov N, Lorian C, Johnston L, Spence J, et al. Internet-delivered cognitive behavioural therapy for depression: a feasibility open trial for older adults. Aust N Z J Psychiatry. 2013;47(2):169-176. [CrossRef] [Medline]
  12. Staples LG, Fogliati VJ, Dear BF, Nielssen O, Titov N. Internet-delivered treatment for older adults with anxiety and depression: implementation of the wellbeing plus course in routine clinical care and comparison with research trial outcomes. BJPsych Open. 2016;2(5):307-313. [FREE Full text] [CrossRef] [Medline]
  13. Titov N, Dear BF, Ali S, Zou JB, Lorian CN, Johnston L, et al. Clinical and cost-effectiveness of therapist-guided internet-delivered cognitive behavior therapy for older adults with symptoms of depression: a randomized controlled trial. Behav Ther. 2015;46(2):193-205. [CrossRef] [Medline]
  14. Tomasino KN, Lattie EG, Ho J, Palac HL, Kaiser SM, Mohr DC. Harnessing peer support in an online intervention for older adults with depression. Am J Geriatr Psychiatry. 2017;25(10):1109-1119. [FREE Full text] [CrossRef] [Medline]
  15. Koder DA, Brodaty H, Anstey KJ. Cognitive therapy for depression in the elderly. Int J Geriat Psychiatry. 1996;11(2):97-107. [CrossRef]
  16. Johnson J, Finn K. Designing User Interfaces for an Aging Population: Towards Universal Design. Burlington, MA. Morgan Kaufmann; 2017.
  17. Xiang X, Kayser J, Sun Y, Himle J. Internet-based psychotherapy intervention for depression among older adults receiving home care: qualitative study of participants' experiences. JMIR Aging. 2021;4(4):e27630. [FREE Full text] [CrossRef] [Medline]
  18. Kroenke K, Spitzer RL, Williams JB. The PHQ-9: validity of a brief depression severity measure. J Gen Intern Med. 2001;16(9):606-613. [FREE Full text] [CrossRef] [Medline]
  19. Lehto T. Designing persuasive health behavior change interventions. In: Wickramasinghe N, Bali R, Suomi R, Kirn S, editors. Critical Issues for the Development of Sustainable E-health Solutions. New York, NY. Springer; 2012;163-181.
  20. Chen AT, Slattery K, Tomasino KN, Rubanovich CK, Bardsley LR, Mohr DC. Challenges and benefits of an internet-based intervention with a peer support component for older adults with depression: qualitative analysis of textual data. J Med Internet Res. 2020;22(6):e17586. [FREE Full text] [CrossRef] [Medline]
  21. Kruzan KP, Meyerhoff J, Biernesser C, Goldstein T, Reddy M, Mohr DC. Centering lived experience in developing digital interventions for suicide and self-injurious behaviors: user-centered design approach. JMIR Ment Health. 2021;8(12):e31367. [FREE Full text] [CrossRef] [Medline]
  22. van Gemert-Pijnen JEWC, Nijland N, van Limburg M, Ossebaard HC, Kelders SM, Eysenbach G, et al. A holistic framework to improve the uptake and impact of eHealth technologies. J Med Internet Res. 2011;13(4):e111. [FREE Full text] [CrossRef] [Medline]
  23. Schueller SM, Tomasino KN, Mohr DC. Integrating human support into behavioral intervention technologies: the efficiency model of support. Clin Psychol Sci Pract. 2016;24(1):27-45. [CrossRef]
  24. Gallagher-Thompson D, Thompson LW. Treating Late Life Depression: A Cognitive-Behavioral Therapy Approach, Therapist Guide. USA. Oxford University Press; 2009.
  25. Laidlaw K, Thompson LW, Gallagher-Thompson D, Dick-Siskin L. Cognitive Behaviour Therapy with Older People. Hoboken, NJ. John Wiley & Sons; 2003.
  26. Lejuez CW, Hopko DR, Acierno R, Daughters SB, Pagoto SL. Ten year revision of the brief behavioral activation treatment for depression: revised treatment manual. Behav Modif. 2011;35(2):111-161. [CrossRef] [Medline]
  27. Williams JM, Teesdale J, Segal Z, Kabat-Zinn J. The Mindful Way Through Depression: Freeing Yourself from Chronic Unhappiness. New York, NY. Guilford Publications; 2007.
  28. Proudfoot J, Goldberg D, Mann A, Everitt B, Marks I, Gray JA. Computerized, interactive, multimedia cognitive-behavioural program for anxiety and depression in general practice. Psychol Med. 2003;33(2):217-227. [CrossRef] [Medline]
  29. Proudfoot J, Ryden C, Everitt B, Shapiro DA, Goldberg D, Mann A, et al. Clinical efficacy of computerised cognitive-behavioural therapy for anxiety and depression in primary care: randomised controlled trial. Br J Psychiatry. 2004;185:46-54. [FREE Full text] [CrossRef] [Medline]
  30. Twomey C, O'Reilly G. Effectiveness of a freely available computerised cognitive behavioural therapy programme (MoodGYM) for depression: meta-analysis. Aust N Z J Psychiatry. 2017;51(3):260-269. [CrossRef] [Medline]
  31. Lewis JR. The system usability scale: past, present, and future. Int J Hum-Comput Interact. 2018;34(7):577-590. [CrossRef]
  32. Peres SC, Pham T, Phillips R. Validation of the System Usability Scale (SUS): SUS in the wild. In: Proceedings of the Human Factors and Ergonomics Society Annual Meeting. Los Angeles, CA. SAGE Publications; September 2013;192-196. [CrossRef]
  33. Katzman R, Brown T, Fuld P, Peck A, Schechter R, Schimmel H. Validation of a short orientation-memory-concentration test of cognitive impairment. Am J Psychiatry. 1983;140(6):734-739. [CrossRef]
  34. Posner K, Brown GK, Stanley B, Brent DA, Yershova KV, Oquendo MA, et al. The Columbia-suicide severity rating scale: initial validity and internal consistency findings from three multisite studies with adolescents and adults. Am J Psychiatry. 2011;168(12):1266-1277. [FREE Full text] [CrossRef] [Medline]
  35. Oxenham AJ. Pitch perception and auditory stream segregation: implications for hearing loss and cochlear implants. Trends Amplif. 2008;12(4):316-331. [FREE Full text] [CrossRef] [Medline]
  36. Solem IKL, Varsi C, Eide H, Kristjansdottir OB, Børøsund E, Schreurs KMG, et al. A user-centered approach to an evidence-based electronic health pain management intervention for people with chronic pain: design and development of EPIO. J Med Internet Res. 2020;22(1):e15889. [FREE Full text] [CrossRef] [Medline]
  37. Lupton D. Digital health now and in the future: findings from a participatory design stakeholder workshop. Digit Health. 2017;3:2055207617740018. [FREE Full text] [CrossRef] [Medline]
  38. Zhong S, Yang X, Pan Z, Fan Y, Chen Y, Yu X, et al. The usability, feasibility, acceptability, and efficacy of digital mental health services in the COVID-19 pandemic: scoping review, systematic review, and meta-analysis. JMIR Public Health Surveill. 2023;9:e43730. [FREE Full text] [CrossRef] [Medline]
  39. Xiang X, Kayser J, Turner S, Zheng C. Layperson-supported internet-delivered cognitive behavioral therapy for depression among older adults. Res Soc Work Pract. 2023 [CrossRef]
  40. Kayser J, Wang X, Wu Z, Dimoji A, Xiang X. Layperson-facilitated internet-delivered cognitive behavioral therapy for homebound older adults with depression: protocol for a randomized controlled trial. JMIR Res Protoc. 2023;12:e44210. [FREE Full text] [CrossRef] [Medline]

CBT: cognitive behavioral therapy
DMHI: digital mental health intervention
iCBT: internet-based cognitive behavioral therapy
PHQ-9: Patient Health Questionnaire-9
SUS: System Usability Scale
UI: user interface

Edited by R Yang; submitted 30.03.23; peer-reviewed by N Titov, W Liang; comments to author 09.06.23; revised version received 05.07.23; accepted 17.08.23; published 19.09.23.


©Xiaoling Xiang, Jay Kayser, Samson Ash, Chuxuan Zheng, Yihang Sun, Addie Weaver, Ruth Dunkle, James A Blackburn, Alex Halavanau, Jia Xue, Joseph A Himle. Originally published in JMIR Aging, 19.09.2023.

This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Aging, is properly cited. The complete bibliographic information, a link to the original publication, as well as this copyright and license information must be included.