Capturing Cognitive Aging in Vivo: Application of a Neuropsychological Framework for Emerging Digital Tools

Authors of this article:

Katherine Hackett1; Tania Giovannetti1

Viewpoint

Department of Psychology and Neuroscience, Temple University, Philadelphia, PA, United States

Corresponding Author:

Tania Giovannetti, PhD

Department of Psychology and Neuroscience

Temple University

1701 N 13th Street

Weiss Hall 6th Floor

Philadelphia, PA, 19122

United States

Phone: 1 215 204 4296

Email: tania.giovannetti@temple.edu


As the global burden of dementia continues to plague our healthcare systems, efficient, objective, and sensitive tools to detect neurodegenerative disease and capture meaningful changes in everyday cognition are increasingly needed. Emerging digital tools present a promising option to address many drawbacks of current approaches, with contexts of use that include early detection, risk stratification, prognosis, and outcome measurement. However, conceptual models to guide hypotheses and interpretation of results from digital tools are lacking and are needed to sort and organize the large amount of continuous data from a variety of sensors. In this viewpoint, we propose a neuropsychological framework for use alongside a key emerging approach—digital phenotyping. The Variability in Everyday Behavior (VIBE) model is rooted in established trends from the neuropsychology, neurology, rehabilitation psychology, cognitive neuroscience, and computer science literature and links patterns of intraindividual variability, cognitive abilities, and everyday functioning across clinical stages from healthy to dementia. Based on the VIBE model, we present testable hypotheses to guide the design and interpretation of digital phenotyping studies that capture everyday cognition in vivo. We conclude with methodological considerations and future directions regarding the application of the digital phenotyping approach to improve the efficiency, accessibility, accuracy, and ecological validity of cognitive assessment in older adults.

JMIR Aging 2022;5(3):e38130

doi:10.2196/38130


The global burden of dementia, a clinical syndrome associated with cognitive deficits that impair everyday functioning, poses a tremendous and growing challenge to our healthcare system. As the worldwide population of older adults continues to increase and becomes more medically complex and diverse, the number of people living with Alzheimer's disease and related dementias (ADRD), for which there are currently no pharmacologic treatments that improve cognition and function [1], is expected to more than double from 55 million in 2021 to over 139 million by 2050 [2]. Estimates of disability-adjusted life years (ie, the sum of years lost due to premature mortality and years lived with disability) indicate that ADRD is extremely burdensome to individuals diagnosed, their families, and their caregivers, ranking among the top 10 most burdensome diseases in the United States [3]. Early diagnosis and intervention before neuronal degeneration and functional disability begin present one promising route to meaningfully delay disease burden and promote aging in place [4-6]. From a health economics perspective, it is estimated that early detection at the prodromal, mild cognitive impairment (MCI) stage [7] could save $7.9 trillion in the United States alone [8]. Novel digital methods have great potential for efficient, accessible, reliable, and accurate assessment of early cognitive changes reflecting ADRD. However, to be most effective, digital tools should be informed by conceptual models that explain and predict early cognitive changes.

In this viewpoint, we focus on the application of digital phenotyping to assess age-related changes in functional cognition, with contexts of use that include early detection, risk stratification, prognosis, and outcome measurement. We begin by outlining current approaches to detecting pathological cognitive change along with their notable drawbacks. The digital phenotyping approach is introduced as a promising complementary method. We then present a neuropsychological framework of everyday cognitive and functional changes, termed the Variability in Everyday Behavior (VIBE) model, which can be used to inform studies and generate testable hypotheses in the context of digital phenotyping. Supporting literature that was integrated to develop the VIBE model is also summarized. We conclude with methodological considerations and future directions regarding the digital phenotyping approach.


Neurodegenerative pathology may be directly measured in the brain tissue and detected in cerebrospinal fluid (CSF) and blood [9]; biological measures are classified using biomarker-based diagnostic frameworks for ADRD [10,11]. Importantly, existing methods of biomarker testing are expensive, not widely available, and may be invasive depending on the methodology (eg, lumbar puncture). More concerning, however, is that biological indicators of neurodegenerative disease yield limited information on clinical outcomes such as progression, cognitive abilities, and everyday functioning [12,13]. For example, approximately 30% of individuals with substantial amyloid burden—a core Alzheimer’s disease (AD) biomarker—fail to show clinical symptoms of dementia, whereas up to 25% of individuals who meet clinical criteria for AD have no or limited amyloid burden [14]. The prioritization of biological outcomes is also concerning given that clinical outcomes such as cognitive and functional abilities are most predictive of quality of life, cost of care, and independence, which are precisely the outcomes that individuals diagnosed, their caregivers, healthcare professionals, and policy makers most value [15].

Neuropsychological assessment is less expensive and invasive compared to biomarker testing and is currently used for clinical staging, differential diagnosis, tracking change in cognitive functioning over time, and informing personalized recommendations. The neuropsychological measures that are used for clinical assessments have undergone extensive psychometric validation and are informed by cognitive neuroscience theories. At present, neuropsychological test results are a key component of clinical diagnostic criteria for dementia and MCI [10,16,17] and serve as a primary end point in most clinical trials [18]. In recent years, several neuropsychological tests and composite measures have shown promise in identifying very early, subtle changes that occur in presymptomatic disease stages [19-21].

Nevertheless, current assessment methods present methodological drawbacks, including lengthy and resource-intensive in-person testing sessions that are often inaccessible to individuals from underserved or rural communities, highly controlled testing environments that foster limited ecological validity and test anxiety, burdensome and error-prone scoring procedures, and limited data sharing infrastructures. Traditional assessments take place on a single occasion representing a one-time snapshot that may not reflect an individual’s typical range of performance or intervening contextual factors [22,23]. Even when repeat testing is performed, practice effects between sessions may obscure subtle but meaningful cognitive decline [24].

New mobile and computerized platforms with enhanced efficiency and sensitivity, such as repeat ambulatory cognitive assessments, address some of these methodological drawbacks [25] and have been examined in various studies among populations comprising healthy and community-based older adults [22,26], those with preclinical AD [27], and those with MCI or early dementia [28]. However, many of these methods continue to be (A) modeled after traditional tests that measure isolated cognitive domains with limited ecological validity; (B) susceptible to practice effects [29]; (C) influenced by socioeconomic status and cultural factors [30-32]; and (D) prone to challenges with adherence even among highly motivated and engaged individuals, particularly with longer study durations [27,33,34]. Thus, while tremendous advances have been made in the realm of digital cognitive assessment, existing methods continue to show limited generalizability to diverse populations and real-world settings, even when used at home outside of the clinic. The strengths and weaknesses of the current approaches are summarized in Table 1.

Table 1. Strengths and weaknesses of current approaches to detect pathological change.
Biomarker testing

Strengths
  • Objective measurement of disease presence in the body
  • Good sensitivity/early detection of target pathology
  • Ability to localize pathology
  • Ability to identify specific pathology

Weaknesses
  • High cost
  • Limited accessibility
  • Potentially invasive (CSFa and blood biomarkers)
  • Limited correspondence with functional outcomes
  • Limited prognostic value
  • Interpretation can be subjective

Traditional neuropsychological assessment

Strengths
  • Extensively validated and informed by cognitive neuroscience theories
  • Noninvasive
  • Measure discrete cognitive abilities
  • Inform personalized recommendations
  • Moderate correspondence with functional outcomes

Weaknesses
  • Limited accessibility
  • Lengthy and error-prone administration and scoring procedures
  • Highly controlled environment and tasks/limited ecological validity
  • Limited sensitivity to early decline
  • Single time point without context
  • Practice effects at repeat administration
  • Influenced by socioeconomic and cultural factors

Mobile cognitive assessment

Strengths
  • Brief administration
  • Improved accessibility
  • Potential for increased sensitivity
  • Noninvasive
  • Ability to assess cognition in everyday context and across multiple time points
  • Possible reduction in test anxiety

Weaknesses
  • Challenges in adherence
  • Practice effects at repeat administration
  • Impact of hardware and software differences when personal devices are used
  • Continued impact of socioeconomic/cultural factors
  • Uncontrolled testing environment may lead to increased measurement error/noise

aCSF: cerebrospinal fluid.


Emerging digital tools offer a unique opportunity to address many of the drawbacks of traditional, computerized, and mobile cognitive testing. One such method is digital phenotyping, an innovative approach that uses the "moment-by-moment quantification of the individual-level human phenotype in situ" based on interactions with technology, including smartphones and smart home devices, to capture social and behavioral data passively, continuously, and with minimal interference [35-37]. Because most everyday tasks require the coordinated effort of multiple cognitive processes and are highly context dependent, digital phenotyping data collected in this passive manner may provide a more naturalistic, comprehensive, and nuanced understanding of behavior and cognition compared with traditional active assessment methods that take place in the clinic, the lab, or during a discrete period of time. In contrast to standardized neuropsychological tasks, which are highly related to educational quality [38] and other sociocultural factors [39], digital proxies of everyday behavior captured in a person's natural environment may yield a less biased measure of cognition and function, particularly when methods rely on longitudinal monitoring of individual change. Furthermore, high-frequency continuous data have the potential to improve sensitivity and reliability and reduce the sample sizes needed to detect subtle differences between groups or among individuals over time [40].

Smartphones, which are ubiquitous, are equipped with a host of embedded sensors that are common across different devices and may be leveraged to passively assess everyday activities and behaviors. Preliminary studies have investigated smartphone-based digital biomarkers (via sensor and application use data) to measure specific behaviors and offer support and validation for call and text message logs [41] as well as call reciprocity [42] as measures of social patterns; WiFi/Bluetooth signals as a proxy for social engagement (time spent proximal to other people) [37]; GPS movement trajectories and keystroke data as measures of mood [43-45]; and accelerometer data to infer sleep patterns [46]. The validity of smartphone digital phenotyping has been demonstrated in mental health and neurological populations, with results supporting the predictive utility of a range of smartphone data for daily stress levels [47], changes in depression and loneliness [43,46,47], psychosis onset and relapse [48-50], suicide risk [47], speech changes [51], and biological rhythms [52].
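As a concrete illustration of how such sociability features can be derived, the sketch below computes a simple daily call reciprocity metric from an anonymized call log. This is a minimal sketch under our own assumptions: the column names, the hashed contact identifiers, and the definition of reciprocity (the share of each day's calls that are outgoing) are illustrative rather than a standard specification.

```python
import pandas as pd

# Hypothetical anonymized call log: one row per call event.
# Assumed columns: timestamp, hashed_contact, direction ("incoming"/"outgoing").
calls = pd.DataFrame({
    "timestamp": pd.to_datetime([
        "2022-05-01 09:12", "2022-05-01 14:30",
        "2022-05-02 10:05", "2022-05-02 19:45",
    ]),
    "hashed_contact": ["a1f3", "a1f3", "b7c9", "a1f3"],
    "direction": ["outgoing", "incoming", "outgoing", "outgoing"],
})

calls["date"] = calls["timestamp"].dt.date

# Count incoming and outgoing calls per day.
daily = (
    calls.pivot_table(index="date", columns="direction",
                      values="hashed_contact", aggfunc="count", fill_value=0)
    .rename_axis(columns=None)
)

# Assumed reciprocity definition: share of calls that are outgoing; values near
# 0 or 1 suggest one-sided communication, values near 0.5 suggest balanced exchange.
daily["call_reciprocity"] = daily["outgoing"] / (daily["outgoing"] + daily["incoming"])
print(daily)
```

Analogous daily summaries (eg, outgoing text length or responsiveness to missed calls) can be computed from the same log structure and carried forward as candidate digital biomarkers.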

Other studies have attempted to identify digital markers that reflect underlying cognitive abilities. A 2018 study of 27 healthy young adults [53] followed by a 2019 study of 84 healthy older adults [54] demonstrated significant associations between smartphone metrics (eg, number of apps used, usage by hour of day, swipes, and keystroke events) and performance on standard cognitive tests. Of note, these studies were exploratory in nature and lacked a priori hypotheses to guide analyses. A separate pilot study of adults with and without bipolar disorder examined performance on a digital trail making test and found associations between smartphone typing speed and typing speed variability and test performance, suggesting a possible link between executive functioning and keystroke measures [29]. In the context of MCI and dementia, a feasibility study employing multiple sensor streams and machine learning models identified 5 digital features that discriminated symptomatic (MCI, mild AD) from asymptomatic groups; these features included typing speed, regularity in behavior (via first and last phone use), number of received text messages, reliance on helper apps, and survey compliance [55]. As noted by the authors of the aforementioned pilot and feasibility studies, a major limitation was the small sample sizes, which limited interpretability.

Indeed, although preliminary studies have laid the groundwork for exploring relationships between passive digital variables and standard measures of cognition, the lack of integrative theoretical models to inform interpretation of large continuous datasets represents a major gap [23]. As digital tools and machine learning approaches become increasingly sophisticated, it is critical that theoretically sound models are developed to avoid scattershot analyses and spurious findings and to facilitate interpretability [56]. Furthermore, as technologies inevitably continue to evolve, the development of testable models that are agnostic to hardware and software differences is key to the continued validation of passive approaches [56-58]. Therefore, we propose a neuropsychological framework to guide studies using emerging digital tools to assess age-related cognitive and functional decline. The VIBE model integrates established findings regarding intraindividual variability, cognitive abilities, and everyday functioning in the context of aging and ADRD. Importantly, the VIBE model generates specific, testable hypotheses grounded in theory that may inform the design and interpretation of future digital phenotyping studies and represents a preliminary step toward establishing conceptual guidelines for the field.


The VIBE model resulted from an in-depth review of the neuropsychology, neurology, neuroscience, rehabilitation psychology, and computer science literature. Consistent findings in both performance level and intraindividual variability were identified across the spectrum of cognitive impairment and interpreted in the context of known patterns of cognitive change and their underlying mechanisms. The literature review was used to conceptualize changes in everyday behavior across the spectrum from healthy aging to ADRD and how these changes would be captured by digital phenotyping approaches. For example, the increased variability in standardized cognitive testing and everyday task performance in people with MCI is expected to result in meaningful variability in passive smartphone sensor data in digital phenotyping studies. Without a framework to guide analyses, aggregate data might be prioritized over meaningful variability, which could be misinterpreted as a nuisance (ie, "noise"). Therefore, the VIBE model integrates and extends existing findings to provide structure and guidance and to optimize digital phenotyping study designs.


Early stages of pathological aging (ie, MCI) are associated with mild isolated decrements on standardized cognitive tests, subtle difficulties with complex activities of daily living, and increased variability in both cognitive and functional measures. Later stages (ie, dementia) are characterized by greater cognitive and functional impairment, reduced activity and task accomplishment, and less variability in cognitive and functional performance. Table 2 provides a summary of these trends. Multimedia Appendix 1 contains a comprehensive review of the supporting literature [59-107].

Table 2. Summary of background literature supporting the Variability in Everyday Behavior (VIBE) framework.

Cognitive ability
  • Healthy aging: subtle declines within normative limits
  • Early decline (MCIa): impaired performance on 1+ domain according to normative scores
  • Later decline (dementia): impaired performance on multiple domains according to normative scores

Cognitive variability
  • Healthy aging: increased variability versus younger adults
  • Early decline (MCI): increased variability versus healthy older adults; increased variability predicts further decline and poorer cognition
  • Later decline (dementia): less variability than MCI for complex tasks at floor; greater variability than MCI for simple tasks

Everyday functioning
  • Healthy aging: subtle changes/inefficient behaviors (microerrors); fully independent
  • Early decline (MCI): difficulty with complex tasks; independent with some compensatory strategy use; inefficient (commission errors) and more variable than healthy older adults
  • Later decline (dementia): impaired for basic and complex tasks; dependent; outright failure to complete tasks (omission errors)

aMCI: mild cognitive impairment.

Theoretical models from computational science offer a useful framework for understanding changes in ability level and variability in the progression of pathological aging. The term “graceful degradation” is used to characterize the way in which complex systems maintain functionality in the face of mild damage or problematic changes in the environment [108]. From a neuropsychological perspective, increased inefficiency and variability in the early stage of decline may stem from faulty executive control mechanisms governed by the prefrontal cortex and associated white matter projections, which, according to a framework proposed by Giovannetti and colleagues [109], are essential to modulate goal activations, enable smooth transitions between goals, and inhibit inappropriate activations from internal or external distractors during everyday tasks. Reductions in extrastriatal dopaminergic neuromodulation required for consistent cognitive control in early stages of dementia support this framework [110-112]. Indeed, long-standing explanations for the link between inconsistency and neurologic disease include impaired neural networks, functional connectivity, and executive functioning [113-115]. An alternative framework from which to interpret early patterns of inefficiency and variability, particularly in the absence of executive function deficits, is the resource theory [116], which originates from the cognitive rehabilitation literature. This theory posits that early damage to any nonspecific brain region depletes overall cognitive resources and leads to errors in task performance and that the level—not the type—of cognitive impairment is critical in determining functioning [109]. As a result of mild resource depletion, compensatory strategies are engaged to allow the system to function, but at a cost (ie, inefficiently, slowly, and inconsistently). In moderate-to-severe stages, greater decrements are observed across multiple cognitive domains, basic activities of daily living are impaired, and patterns of variability are less clear because people are generally less active.

Considering this, we propose the Variability in Everyday Behavior (VIBE) model as a dual-pronged neuropsychological framework that integrates trends in variability (see Figure 1, blue dotted line showing a U-shaped pattern peaking at MCI) and declining ability level (see Figure 1, solid purple line showing a negative linear trend) that are observed across the cognitive aging spectrum. The VIBE model proposes a theoretical foundation from which to evaluate metrics of everyday behavior and cognition captured by the digital phenotyping approach, both in studies examining cross-sectional differences in individuals with different levels of cognitive impairment, and over time in individuals with progressive neurodegenerative disease in longitudinal designs. For example, decreasing cognitive abilities may be indexed by decreases in social activity [117,118], technology usage [119,120], positive mood (ie, increased depressive symptoms [121]), and range of movement/physical activity [122], which can all be inferred from passive sensor metrics. These activity metrics tend to remain stable in earlier stages and begin to decline more notably in the transition from MCI to dementia. A simultaneous examination of intraindividual variability within these metrics across a longitudinal study period may reflect increased day-to-day variability as early as the healthy to MCI transition stage, as individuals engage reserve mechanisms and compensatory strategies to combat mild difficulties and inefficiencies (eg, commission errors). On metrics/activities where dementia-level performance is at floor (eg, movement trajectories outside the home, text messaging, other complex activities where compensatory mechanisms have failed and task goals are no longer achieved; ie, omission errors), we expect variability to decrease below that which we observe in MCI (Figure 1, blue dotted line).

Figure 1. The Variability in Everyday Behavior (VIBE) model of intraindividual variability, cognitive abilities, and everyday functioning for pathological cognitive decline in older adults.
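To make the model's two prongs concrete, the brief sketch below computes an ability-level proxy (the mean of a daily passive metric) and an intraindividual variability proxy (the coefficient of variation across days) from simulated data. The simulated values, group labels, and the choice of the coefficient of variation are illustrative assumptions; other dispersion indices, such as the intraindividual standard deviation, could be substituted.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
days = 60

# Simulated daily distance traveled (km) over 60 days for three illustrative
# participants; all values and labels are hypothetical, not empirical data.
sim = pd.DataFrame({
    "pid": ["healthy"] * days + ["mci"] * days + ["dementia"] * days,
    "distance_km": np.concatenate([
        rng.normal(8.0, 1.0, days),   # healthy: higher level, lower variability
        rng.normal(6.0, 2.5, days),   # MCI: intermediate level, highest variability
        rng.normal(2.0, 0.8, days),   # dementia: low level, low variability
    ]),
})

# Ability-level proxy (mean) and intraindividual variability proxy
# (coefficient of variation = SD / mean) for each participant.
summary = (
    sim.groupby("pid")["distance_km"]
    .agg(mean_level="mean", sd="std")
    .assign(cv=lambda d: d["sd"] / d["mean_level"])
)
print(summary.round(2))
```

Under the VIBE model, the level metric is expected to fall monotonically across clinical stages, whereas the variability metric is expected to peak at MCI, mirroring the solid and dotted lines in Figure 1.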

The existing literature is less clear on patterns of variability in the transition from MCI to dementia [123], and we acknowledge the possibility that for relatively simple activities that individuals with mild dementia still perform (eg, movement trajectories within the home, incoming phone calls, sleep/wake cycle), variability may continue to increase in the mild dementia stage, followed by an eventual decrease as abilities further decline. Thus, model predictions should be tested and interpreted with attention to task demands, as well as other contextual features, including the time of day [29], mood, and technology use habits. In other words, the progression from increased variability to decreased variability and complete failure to act depicted in Figure 1 is expected with increasing severity of impairment, though impairment level is determined by more than just clinical status. There may be some period—likely at the transition between MCI and dementia—where contextual factors (eg, task complexity, time of day, external distractors) interact with clinical status to influence the level of impairment on metrics of everyday behavior. For example, a person with mild dementia may show marked impairment and decreased variability in financial tasks but only mild impairment and increased variability in meal preparation until later in the course of their illness, when both tasks are impaired and variability is diminished. Thus, task effects should be carefully considered, particularly at the boundary of MCI and dementia.


Digital phenotyping using personal smartphone devices represents a promising method to examine age-related changes in functional cognition according to our proposed framework. Study designs may take a variety of forms, but initial studies should include collection of clinically relevant validation measures and longitudinal monitoring. One potential protocol would involve comprehensive baseline assessment to gather gold-standard validation data on function, cognition, mood, and other participant features such as demographics, attitudes toward and experience with technology, and technology use habits that are likely to influence resulting digital data. A period of passive longitudinal monitoring using available, open-source digital phenotyping applications (eg, Beiwe [37], mindLAMP [124]) would involve collection of a host of sensor and application data, including the examples listed in Textbox 1.

The VIBE framework enables systematic selection and analysis of the mobility, sociability, and device activity features from Textbox 1 to obtain activity and variability metrics that can be tested against a priori hypotheses. A nonexhaustive list of sample hypotheses derived from the VIBE model that are appropriate for cross-sectional studies of older adults across the cognitive aging spectrum is included in Table 3.


Mobility

  • Time spent at home
  • Distance traveled
  • Radius of gyration
  • Maximum diameter
  • Maximum distance from home
  • Number of significant locations
  • Average flight length
  • Standard deviation of flight length
  • Average flight duration
  • Standard deviation of flight duration
  • Fraction of the day spent stationary
  • Significant location entropy
  • Minutes of GPS data missing
  • Physical circadian rhythm
  • Average sleep duration
  • Standard deviation of sleep duration

Sociability

  • Number of outgoing texts
  • Total outgoing text length
  • Number of incoming texts
  • Total incoming text length
  • Texting reciprocity
  • Texting responsiveness
  • Number of outgoing calls
  • Total outgoing call duration
  • Number of incoming calls
  • Total incoming call durations
  • Call reciprocity
  • Call responsiveness

Device activity

  • Average battery level
  • Total phone off/on events
  • Total charge initiations
  • Total screen on/off events
  • Total application launches
  • Application switches
  • Central processing unit (CPU) utilization
  • Time to view daily survey
  • Time to complete daily survey
  • Time of first/last screen on event
Textbox 1. Example digital phenotyping metrics across 3 feature domains.
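For illustration, the following sketch computes two of the mobility metrics listed in Textbox 1 (radius of gyration and fraction of time spent at home) from raw GPS fixes. This is a simplified sketch under stated assumptions: the planar projection, the 100 m home radius, and the toy trace are our own choices for brevity, and production digital phenotyping pipelines typically add trajectory imputation and significant-location clustering before computing such features.

```python
import numpy as np
import pandas as pd

EARTH_RADIUS_M = 6.371e6

def to_local_meters(lat, lon):
    """Project latitude/longitude to approximate local planar coordinates in meters."""
    lat0, lon0 = np.mean(lat), np.mean(lon)
    x = np.radians(lon - lon0) * EARTH_RADIUS_M * np.cos(np.radians(lat0))
    y = np.radians(lat - lat0) * EARTH_RADIUS_M
    return x, y

def radius_of_gyration(lat, lon):
    """Root-mean-square distance of GPS fixes from their centroid, in meters."""
    x, y = to_local_meters(np.asarray(lat, dtype=float), np.asarray(lon, dtype=float))
    return float(np.sqrt(np.mean((x - x.mean()) ** 2 + (y - y.mean()) ** 2)))

def fraction_time_at_home(lat, lon, home_lat, home_lon, radius_m=100.0):
    """Share of fixes falling within an assumed 100 m radius of the home location."""
    x, y = to_local_meters(np.append(lat, home_lat), np.append(lon, home_lon))
    dist = np.hypot(x[:-1] - x[-1], y[:-1] - y[-1])
    return float(np.mean(dist <= radius_m))

# Hypothetical one-day trace: mostly at home with one short outing.
trace = pd.DataFrame({
    "lat": [39.9810] * 8 + [39.9835, 39.9850, 39.9835] + [39.9810] * 5,
    "lon": [-75.1550] * 8 + [-75.1570, -75.1590, -75.1570] + [-75.1550] * 5,
})

print("radius of gyration (m):", round(radius_of_gyration(trace["lat"], trace["lon"]), 1))
print("fraction of fixes at home:", round(
    fraction_time_at_home(trace["lat"], trace["lon"], 39.9810, -75.1550), 2))
```

Computed daily, such features yield the per-day series from which both total activity levels and across-day variability can be summarized.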
Table 3. Sample hypotheses informed by the Variability in Everyday Behavior (VIBE) model.
Mobility
  • Total activity level: Average distance traveled from home will decline from healthy to MCIa to dementia.
  • Across-day variability: Variability in distance traveled from home will be highest in MCI versus healthy/dementia.

Sociability
  • Total activity level: Average number of outgoing calls will decline from healthy to MCI to dementia.
  • Across-day variability: Variability in daily average outgoing text length will be highest in MCI versus healthy/dementia.

Device activity
  • Total activity level: Average number of application launches will decline from healthy to MCI to dementia.
  • Across-day variability: Variability in daily number of screen on/off events will be greater in MCI versus healthy/dementia.

Time of day effects
  • Total activity level: Average time of first phone use will shift later from healthy (earlier) to MCI to dementia (later).
  • Across-day variability: Time of first phone use will be most variable in MCI versus healthy/dementia.

aMCI: mild cognitive impairment.
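As one example of how a hypothesis from Table 3 could be evaluated in a cross-sectional design, the sketch below simulates three diagnostic groups, summarizes each participant's across-day variability in distance traveled, and compares the groups with a nonparametric omnibus test. The group sizes, effect sizes, and the choice of the Kruskal-Wallis test are illustrative assumptions rather than a prescribed analysis plan.

```python
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(1)

def simulate_group(label, n, level, day_sd, days=30):
    """Simulate daily distance for n participants; return each person's across-day SD."""
    rows = []
    for _ in range(n):
        daily = rng.normal(level, day_sd, days)
        rows.append({"group": label, "across_day_sd": np.std(daily, ddof=1)})
    return pd.DataFrame(rows)

# Hypothetical groups in which MCI shows the largest day-to-day variability,
# consistent with the VIBE prediction in Table 3.
data = pd.concat([
    simulate_group("healthy", 30, level=8.0, day_sd=1.0),
    simulate_group("mci", 30, level=6.0, day_sd=2.5),
    simulate_group("dementia", 30, level=2.0, day_sd=0.8),
])

# Omnibus comparison of across-day variability across diagnostic groups.
groups = [g["across_day_sd"].to_numpy() for _, g in data.groupby("group")]
h_stat, p_value = stats.kruskal(*groups)
print(f"Kruskal-Wallis H = {h_stat:.2f}, p = {p_value:.4f}")
print(data.groupby("group")["across_day_sd"].median().round(2))
```

Pairwise follow-up tests or ordinal contrasts would be needed to confirm that the MCI group, specifically, shows the highest variability.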


There are a host of important methodological factors that must be thoughtfully considered when conducting such studies, many of which remain unresolved. Cross-device compatibility is a concern that becomes relevant when participants use their own personal devices for data collection. Individual devices may differ in operating system, screen size, sensor sampling rates, and more. These device differences impact user interactions and the quality of data that is collected; they are also related to socioeconomic status and other important participant features and thus cannot simply be covaried out in analyses. A single study-issued device may be provided to all participants to standardize data collection and ensure that individuals from underserved groups have an equal opportunity to participate in such studies. However, introducing new technology creates a deviation from participants' routines, diminishing ecological validity and posing more demands on everyday functioning. Therefore, the personal versus study-provided device decision must be weighed according to the study population and specific aims [27,33]. Although there is a critical concern that studies employing personal digital devices will serve to widen existing health disparities, rates of smartphone ownership—particularly among diverse individuals—have risen sharply in recent years, reaching 85% of Americans as of 2021, up from just 35% in 2011 [125]. This rate is consistent across individuals who identify as White (85%), Black (83%), and Hispanic (85%) and is only slightly lower (76%) for individuals with a household annual income less than US $30,000. Therefore, although careful attention must be paid to ensure smartphone studies are equitable, accessible, and generalizable to all, the increased affordability of smartphones may alleviate this concern. Relatedly, recruitment efforts should ensure diverse representation within digital phenotyping studies to investigate the generalizability of these methods. Updates to hardware, software, and allowable permissions (ie, which sensors an app is permitted to collect data from) are occurring at increasingly frequent rates as technology evolves, presenting an additional challenge to the continued validation and generalizability of such approaches. Thus, a device- and operating system–agnostic theoretical model, such as the VIBE model, from which to continually evaluate new data is critically important.

The naturalistic and passive collection of data in a completely unstandardized fashion presents an additional challenge in making between-group comparisons [56], and it remains undetermined whether between-group differences in metrics of interest will emerge despite individual differences in everyday phone use. The most powerful insights from the digital phenotyping approach may be realized by monitoring intraindividual change over longer periods of time, which would require theoretically informed statistical models to make generalizable claims in n-of-1 trials [56]. Another open question relates to the various sampling rates that can be selected to collect raw data from phone sensors and applications, which should be considered in the context of the scientific question and device battery limitations. Although most software platforms include default settings for sensor sampling (eg, GPS sampled at 1 Hz when the phone is in motion, WiFi signals recorded every 5 minutes), they also allow for customization of sampling rates [37]. A variety of GPS sampling rates have been applied across several studies of primarily young adult participants [48,49], and statistical approaches for imputing missing mobility data have been developed [126]. However, limited studies have examined the incremental utility of increased sampling rates across sensors other than mobility for making predictions of interest. Older adult phone users may require less frequent sampling due to less activity, though this may result in a restricted range of variability and impact findings. Determining the minimum necessary sampling frequency for smartphone data is directly tied to feasibility and is critical to inform the design of future studies, as greater frequencies come with greater costs (ie, increasingly expensive sensors, decreased battery life, increased storage needs). This also applies to the optimal length of the data collection period and the study sample size, which may differ depending on the population of interest and the study design [120], and are not appropriately determined using traditional power calculation methods. Barnett and colleagues [127] recommend the use of generalized linear mixed models and change point detection methods to inform the sample size and study duration necessary to achieve adequate power in such studies.
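To illustrate the class of models recommended above, the sketch below fits a linear mixed model with participant-level random intercepts to simulated daily distance data, relating the metric to diagnostic group and study day; repeating such fits over many simulated datasets is one way to approximate power for a given sample size and monitoring duration. All variable names, effect sizes, and noise levels here are hypothetical assumptions made for demonstration.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)

# Simulate 20 participants per group, each monitored for 30 days (hypothetical values).
rows = []
for group, base, slope in [("healthy", 8.0, -0.005), ("mci", 6.0, -0.02), ("dementia", 3.0, -0.01)]:
    for p in range(20):
        intercept = base + rng.normal(0, 1.0)  # participant-specific random intercept
        for day in range(30):
            rows.append({
                "pid": f"{group}_{p}",
                "group": group,
                "day": day,
                "distance_km": intercept + slope * day + rng.normal(0, 1.5),
            })
data = pd.DataFrame(rows)

# Linear mixed model: fixed effects of diagnostic group and study day,
# random intercept for each participant to account for repeated daily measures.
model = smf.mixedlm("distance_km ~ C(group, Treatment('healthy')) + day",
                    data, groups=data["pid"])
result = model.fit()
print(result.summary())
```

In a power simulation, the same model would be refit across many simulated datasets while varying the number of participants and monitoring days, recording how often the group effects of interest reach significance.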

Digital phenotyping studies may employ a combination of passive and active data streams. In active data collection, users are prompted to complete a standardized test or survey on their smartphones, which can yield key contextual information to inform the interpretation of passive sensor data [23,37,128-130]. However, this type of engagement detracts from the unobtrusive, naturalistic nature of pure passive monitoring, and it is unclear which types of active data are most useful when attempting to infer cognition from passive digital data. These methodological questions around sampling frequency and active data collection have not yet been explored in a population of older adult phone users, whose usage patterns may differ from those of younger adults and who may require higher sampling frequencies or more active data to accurately infer clinically relevant information.

It is also important to establish the context of use of the digital phenotyping approach and determine whether it is best applied as a risk, diagnostic, monitoring, prognostic, or outcome measurement tool. Regulatory agencies like the US Food and Drug Administration and pharmaceutical companies have increasingly recognized the potential of digital devices as a source of "real-world data" and "real-world evidence," with the capability to monitor health status and clinical response over time and yield new insights about long-term health outcomes in the real world, outside of traditional randomized controlled trials [131]. However, as thoughtfully outlined by O'Bryant and colleagues [9], there are many challenges associated with translating new biomarker discoveries from research domains to routine clinical settings. For this to occur, standardization of the underlying platforms and data frameworks is critical to help make these data more uniform, interoperable, reproducible, and actionable [124]. Questions of scalability, manufacturability, intellectual property law, and regulatory considerations, including inconsistent governance of entities conducting digital health research [132], should not be disregarded [9]. In particular, the point at which mobile digital phenotyping applications are considered "software as a medical device" is ambiguous in the face of rapidly evolving regulatory guidance [133]. Finally, and most importantly, privacy and security concerns must be addressed, and protections of confidentiality must be clearly and continuously communicated to users and participants. Deidentification using study identification numbers, industry-standard encryption methods, storage of encrypted data on secure servers, and ongoing transparency and control over personal data are examples of privacy considerations that should be carefully addressed at the study design phase. Given the extent of personal and sensitive health information involved, prospective risk assessment using tools like the Digital Health Checklist for Researchers should be completed to evaluate risks and benefits and ensure safe and responsible use of digital tools [132,134]. Importantly, the development and enforcement of privacy standards that are applied consistently across studies will be key to the success of this burgeoning field [35].


Despite the numerous unresolved challenges and considerations outlined above, the potential for the digital phenotyping approach to yield ecologically valid and sensitive information on changes in everyday cognition is increasingly apparent. The benefits of emerging digital approaches are outlined in detail in the recent American Psychological Association Handbook of Neuropsychology [57]. To reiterate a few, sample size requirements are reduced when using continuous large-scale data, and subtle fluctuations can be captured when data are sampled at such high rates, yielding a highly sensitive measure captured in vivo. The use of a personal smartphone is itself a complex activity of daily living, creating an ideal platform to capture changes that occur early in the disease course. Early detection of decline provides an opportunity for early intervention, which can lead to notable cost savings and reduced disability-adjusted life years, as noted earlier. Increased smartphone ownership affords greater accessibility compared to traditional methods. Passive data are objective and do not rely on current or retrospective self-report. However, it is possible that the optimal application of this approach involves a blend of passive phenotyping, ecological momentary assessment for context, and burst cognitive testing to uncover the mechanisms of how changes in cognition within and across days relate to changes in behavior. Additionally, within-person n-of-1 designs may be more sensitive and may address the interpretive challenges of between-groups designs. Finally, emerging digital methods should be considered complementary to traditional neuropsychological evaluations, which remain the gold standard tools for diagnosis and intervention. If shown to be valid, emerging digital tools may represent a sensitive and accessible first-line measure for early detection, risk stratification, and measurement of change in response to intervention.


Traditional approaches to measuring age-related changes in cognition and function provide valuable and distinct insights. Notable strengths of biomarker, traditional, and mobile cognitive assessments include extensive validation, measurement of discrete cognitive abilities, and localization of pathology (Table 1). At the same time, these approaches present many drawbacks that have become increasingly apparent in the face of technological advances that offer innovative solutions. The digital phenotyping approach is just 1 example of a novel tool that can serve as an increasingly accessible, efficient, sensitive, and personalized complement. Importantly, digital phenotyping remains in its infancy, and many methodological considerations warrant careful attention. Multiple sources of person-specific differences (eg, hardware, software, technology habits, daily routines), as well as the interpretive challenges of large-scale continuous datasets, make comparisons across individuals and across studies nearly impossible without a sound theoretical model from which to design and interpret such studies. The VIBE model, supported by decades of cross-discipline literature in neuropsychology, neurology, neuroscience, rehabilitation psychology, and computer science, proposes testable hypotheses (see Figure 1 and Table 3) that can be used in future digital phenotyping studies to provide novel, valuable, and clinically interpretable insights into meaningful changes in everyday behavior and cognition.

Acknowledgments

We acknowledge funding from the National Institutes of Health (NIH) and National Institute on Aging (NIA), including the following grants: F31AG069444 (author KH) and R21AG066771, R21AG060422, and R01AG062503 (author TG). We also acknowledge Ian Barnett for his cosponsorship of KH’s F31 grant, including consultation on digital phenotyping methodologies and related literature.

Authors' Contributions

KH and TG conceptualized the proposed framework and collaboratively reviewed the pertinent literature. KH drafted the manuscript. TG reviewed and provided conceptual and editorial advice for the manuscript. All authors read and approved the final manuscript.

Conflicts of Interest

None declared.

Multimedia Appendix 1

Supporting background literature.

DOCX File, 19 KB

  1. Branca C, Oddo S. Paving the way for new clinical trials for Alzheimer's disease. Biol Psychiatry 2017 Jan 15;81(2):88-89. [CrossRef] [Medline]
  2. Dementia key facts 2021. World Health Organization.   URL: https://www.who.int/news-room/fact-sheets/detail/dementia [accessed 2022-04-01]
  3. 2022 Alzheimer's disease facts and figures. Alzheimers Dement 2022 Apr 14;18(4):700-789. [CrossRef] [Medline]
  4. Schelke MW, Hackett K, Chen JL, Shih C, Shum J, Montgomery ME, et al. Nutritional interventions for Alzheimer's prevention: a clinical precision medicine approach. Ann N Y Acad Sci 2016 Mar;1367(1):50-56 [FREE Full text] [CrossRef] [Medline]
  5. Sperling RA, Aisen PS, Beckett LA, Bennett DA, Craft S, Fagan AM, et al. Toward defining the preclinical stages of Alzheimer's disease: recommendations from the National Institute on Aging-Alzheimer's Association workgroups on diagnostic guidelines for Alzheimer's disease. Alzheimers Dement 2011 May;7(3):280-292 [FREE Full text] [CrossRef] [Medline]
  6. Sperling RA, Rentz DM, Johnson KA, Karlawish J, Donohue M, Salmon DP, et al. The A4 study: stopping AD before symptoms begin? Sci Transl Med 2014 Mar 19;6(228):228fs13 [FREE Full text] [CrossRef] [Medline]
  7. Petersen RC. Mild cognitive impairment as a diagnostic entity. J Intern Med 2004 Sep;256(3):183-194 [FREE Full text] [CrossRef] [Medline]
  8. Alzheimer's Association. 2019 Alzheimer's disease facts and figures. Alzheimers Dement 2019 Mar;15(3):321-387. [CrossRef]
  9. O'Bryant SE, Mielke MM, Rissman RA, Lista S, Vanderstichele H, Zetterberg H, et al. Blood-based biomarkers in Alzheimer disease: Current state of the science and a novel collaborative paradigm for advancing from discovery to clinic. Alzheimers Dement 2017 Jan 18;13(1):45-58 [FREE Full text] [CrossRef] [Medline]
  10. Jack CR, Bennett DA, Blennow K, Carrillo MC, Dunn B, Haeberlein SB, Contributors. NIA-AA Research Framework: Toward a biological definition of Alzheimer's disease. Alzheimers Dement 2018 Apr;14(4):535-562 [FREE Full text] [CrossRef] [Medline]
  11. McKeith IG, Boeve BF, Dickson DW, Halliday G, Taylor J, Weintraub D, et al. Diagnosis and management of dementia with Lewy bodies: Fourth consensus report of the DLB Consortium. Neurology 2017 Jul 04;89(1):88-100 [FREE Full text] [CrossRef] [Medline]
  12. Negash S, Wilson RS, Leurgans SE, Wolk DA, Schneider JA, Buchman AS, et al. Resilient brain aging: characterization of discordance between Alzheimer's disease pathology and cognition. Curr Alzheimer Res 2013 Oct;10(8):844-851 [FREE Full text] [CrossRef] [Medline]
  13. Brookmeyer R, Abdalla N. Estimation of lifetime risks of Alzheimer's disease dementia using biomarkers for preclinical disease. Alzheimers Dement 2018 May 22;14(8):981-988. [CrossRef]
  14. Glymour MM, Brickman AM, Kivimaki M, Mayeda ER, Chêne G, Dufouil C, et al. Will biomarker-based diagnosis of Alzheimer's disease maximize scientific progress? Evaluating proposed diagnostic criteria. Eur J Epidemiol 2018 Jul 9;33(7):607-612 [FREE Full text] [CrossRef] [Medline]
  15. Tochel C, Smith M, Baldwin H, Gustavsson A, Ly A, Bexelius C, ROADMAP consortium. What outcomes are important to patients with mild cognitive impairment or Alzheimer's disease, their caregivers, and health-care professionals? A systematic review. Alzheimers Dement 2019 Dec 09;11(1):231-247 [FREE Full text] [CrossRef] [Medline]
  16. Albert MS, DeKosky ST, Dickson D, Dubois B, Feldman HH, Fox NC, et al. The diagnosis of mild cognitive impairment due to Alzheimer's disease: recommendations from the National Institute on Aging-Alzheimer's Association workgroups on diagnostic guidelines for Alzheimer's disease. Alzheimers Dement 2011 May;7(3):270-279 [FREE Full text] [CrossRef] [Medline]
  17. McKhann GM, Knopman DS, Chertkow H, Hyman BT, Jack CR, Kawas CH, et al. The diagnosis of dementia due to Alzheimer's disease: recommendations from the National Institute on Aging-Alzheimer's Association workgroups on diagnostic guidelines for Alzheimer's disease. Alzheimers Dement 2011 May;7(3):263-269 [FREE Full text] [CrossRef] [Medline]
  18. Kozauer N, Katz R. Regulatory innovation and drug development for early-stage Alzheimer's disease. N Engl J Med 2013 Mar 28;368(13):1169-1171. [CrossRef] [Medline]
  19. Langbaum JB, Hendrix SB, Ayutyanont N, Chen K, Fleisher AS, Shah RC, et al. An empirically derived composite cognitive test score with improved power to track and evaluate treatments for preclinical Alzheimer's disease. Alzheimers Dement 2014 Nov 21;10(6):666-674 [FREE Full text] [CrossRef] [Medline]
  20. Papp KV, Rentz DM, Orlovsky I, Sperling RA, Mormino EC. Optimizing the preclinical Alzheimer's cognitive composite with semantic processing: The PACC5. Alzheimers Dement 2017 Nov 09;3(4):668-677 [FREE Full text] [CrossRef] [Medline]
  21. Papp KV, Rentz DM, Mormino EC, Schultz AP, Amariglio RE, Quiroz Y, et al. Cued memory decline in biomarker-defined preclinical Alzheimer disease. Neurology 2017 Apr 11;88(15):1431-1438 [FREE Full text] [CrossRef] [Medline]
  22. Sliwinski MJ, Mogle JA, Hyun J, Munoz E, Smyth JM, Lipton RB. Reliability and Validity of Ambulatory Cognitive Assessments. Assessment 2018 Jan;25(1):14-30 [FREE Full text] [CrossRef] [Medline]
  23. Weizenbaum E, Torous J, Fulford D. Cognition in context: understanding the everyday predictors of cognitive performance in a new era of measurement. JMIR Mhealth Uhealth 2020 Jul 23;8(7):e14328. [CrossRef]
  24. Goldberg TE, Harvey PD, Wesnes KA, Snyder PJ, Schneider LS. Practice effects due to serial cognitive assessment: Implications for preclinical Alzheimer's disease randomized controlled trials. Alzheimers Dement 2015 Mar;1(1):103-111 [FREE Full text] [CrossRef] [Medline]
  25. Miller J, Barr W. The technology crisis in neuropsychology. Arch Clin Neuropsychol 2017 Aug 01;32(5):541-554. [CrossRef] [Medline]
  26. Buckley RF, Sparks KP, Papp KV, Dekhtyar M, Martin C, Burnham S, et al. Computerized Cognitive Testing for Use in Clinical Trials: a comparison of the NIH Toolbox and Cogstate C3 batteries. J Prev Alzheimers Dis 2017;4(1):3-11 [FREE Full text] [CrossRef] [Medline]
  27. Öhman F, Hassenstab J, Berron D, Schöll M, Papp KV. Current advances in digital cognitive assessment for preclinical Alzheimer's disease. Alzheimers Dement 2021 Jul 20;13(1):e12217 [FREE Full text] [CrossRef] [Medline]
  28. Wild K, Howieson D, Webbe F, Seelye A, Kaye J. Status of computerized cognitive testing in aging: a systematic review. Alzheimers Dement 2008 Nov;4(6):428-437 [FREE Full text] [CrossRef] [Medline]
  29. Ross MK, Demos AP, Zulueta J, Piscitello A, Langenecker SA, McInnis M, et al. Naturalistic smartphone keyboard typing reflects processing speed and executive function. Brain Behav 2021 Nov 06;11(11):e2363 [FREE Full text] [CrossRef] [Medline]
  30. Heaton R, Miller S, Taylor M, Grant I. Revised Comprehensive Norms for an Expanded Halstead-Reitan Battery: Demographically Adjusted Neuropsychological Norms for African American and Caucasian Adults. Lutz, FL: Psychological Assessment Resources; 2004.
  31. Manly JJ, Jacobs DM, Touradji P, Small SA, Stern Y. Reading level attenuates differences in neuropsychological test performance between African American and White elders. J Int Neuropsychol Soc 2002 Mar 16;8(3):341-348. [CrossRef] [Medline]
  32. Manly JJ, Byrd DA, Touradji P, Stern Y. Acculturation, reading level, and neuropsychological test performance among African American elders. Appl Neuropsychol 2004 Mar;11(1):37-46. [CrossRef] [Medline]
  33. Hassenstab J, Aschenbrenner AJ, Balota DA, McDade E, Lim YY, Fagan AM, et al. Remote cognitive assessment approaches in the Dominantly Inherited Alzheimer Network (DIAN). Alzheimers Dement 2020 Dec 07;16(S6). [CrossRef]
  34. Pratap A, Neto EC, Snyder P, Stepnowsky C, Elhadad N, Grant D, et al. Indicators of retention in remote digital health studies: a cross-study evaluation of 100,000 participants. NPJ Digit Med 2020 Feb 17;3(1):21 [FREE Full text] [CrossRef] [Medline]
  35. Insel TR. Digital phenotyping: technology for a new science of behavior. JAMA 2017 Oct 03;318(13):1215-1216. [CrossRef] [Medline]
  36. Jain SH, Powers BW, Hawkins JB, Brownstein JS. The digital phenotype. Nat Biotechnol 2015 May;33(5):462-463. [CrossRef] [Medline]
  37. Torous J, Kiang MV, Lorme J, Onnela J. New tools for new research in psychiatry: a scalable and customizable platform to empower data driven smartphone research. JMIR Ment Health 2016 May 05;3(2):e16 [FREE Full text] [CrossRef] [Medline]
  38. Fyffe D, Mukherjee S, Barnes L, Manly J, Bennett D, Crane P. Explaining differences in episodic memory performance among older African Americans and whites: the roles of factors related to cognitive reserve and test bias. J Int Neuropsychol Soc 2011 May 06;17(4):625-638. [CrossRef]
  39. Byrd DA, Rivera-Mindt MG. Neuropsychology's race problem does not begin or end with demographically adjusted norms. Nat Rev Neurol 2022 Mar;18(3):125-126. [CrossRef] [Medline]
  40. Dodge HH, Zhu J, Mattek NC, Austin D, Kornfeld J, Kaye JA. Use of high-frequency in-home monitoring data may reduce sample sizes needed in clinical trials. PLoS One 2015 Sep 17;10(9):e0138095 [FREE Full text] [CrossRef] [Medline]
  41. Onnela J, Arbesman S, González MC, Barabási AL, Christakis NA. Geographic constraints on social network groups. PLoS One 2011 Apr 05;6(4):e16939 [FREE Full text] [CrossRef] [Medline]
  42. Onnela J, Rauch SL. Harnessing smartphone-based digital phenotyping to enhance behavioral and mental health. Neuropsychopharmacology 2016 Dec;41(7):1691-1696 [FREE Full text] [CrossRef] [Medline]
  43. Saeb S, Zhang M, Kwasny MM, Karr CJ, Kording K, Mohr DC. The relationship between clinical, momentary, and sensor-based assessment of depression. Int Conf Pervasive Comput Technol Healthc 2015 Aug;2015 [FREE Full text] [CrossRef] [Medline]
  44. Zulueta J, Piscitello A, Rasic M, Easter R, Babu P, Langenecker SA, et al. Predicting mood disturbance severity with mobile phone keystroke metadata: a BiAffect digital phenotyping study. J Med Internet Res 2018 Dec 20;20(7):e241 [FREE Full text] [CrossRef] [Medline]
  45. Vesel C, Rashidisabet H, Zulueta J, Stange JP, Duffecy J, Hussain F, et al. Effects of mood and aging on keystroke dynamics metadata and their diurnal patterns in a large open-science sample: A BiAffect iOS study. J Am Med Inform Assoc 2020 Jul 01;27(7):1007-1018. [CrossRef] [Medline]
  46. Faherty LJ, Hantsoo L, Appleby D, Sammel MD, Bennett IM, Wiebe DJ. Movement patterns in women at risk for perinatal depression: use of a mood-monitoring mobile application in pregnancy. J Am Med Inform Assoc 2017 Jul 01;24(4):746-753. [CrossRef] [Medline]
  47. Ben-Zeev D, Scherer EA, Wang R, Xie H, Campbell AT. Next-generation psychiatric assessment: Using smartphone sensors to monitor behavior and mental health. Psychiatr Rehabil J 2015 Sep;38(3):218-226 [FREE Full text] [CrossRef] [Medline]
  48. Barnett I, Torous J, Staples P, Sandoval L, Keshavan M, Onnela J. Relapse prediction in schizophrenia through digital phenotyping: a pilot study. Neuropsychopharmacology 2018 Jul;43(8):1660-1666. [CrossRef] [Medline]
  49. Torous J, Staples P, Barnett I, Sandoval LR, Keshavan M, Onnela J. Characterizing the clinical relevance of digital phenotyping data quality with applications to a cohort with schizophrenia. NPJ Digit Med 2018 Apr 6;1(1):1-9 [FREE Full text] [CrossRef] [Medline]
  50. Wang R, Aung M, Abdullah S, Brian R, Campbell A, Choudhury T, et al. CrossCheck: toward passive sensing and detection of mental health changes in people with schizophrenia. 2016 Presented at: ACM International Joint Conference on Pervasive and Ubiquitous Computing; September 12-16; Heidelberg, Germany p. 886-897. [CrossRef]
  51. Berry JD, Paganoni S, Carlson K, Burke K, Weber H, Staples P, et al. Design and results of a smartphone-based digital phenotyping study to quantify ALS progression. Ann Clin Transl Neurol 2019 May;6(5):873-881 [FREE Full text] [CrossRef] [Medline]
  52. Murnane E, Abdullah S, Matthews M, Kay M, Kientz J, Choudhury T, et al. Mobile Manifestations of Alertness: Connecting Biological Rhythms with Patterns of Smartphone App Use. In: Proceedings of the 18th international conference on human-computer interaction with mobile devices and services. USA: ACM; 2016 Sep Presented at: MobileHCI '16; September 2016; Florence, Italy p. 465-477   URL: https://europepmc.org/abstract/MED/30931436 [CrossRef]
  53. Dagum P. Digital biomarkers of cognitive function. NPJ Digit Med 2018 Mar 28;1(1):10 [FREE Full text] [CrossRef] [Medline]
  54. Gordon M, Gatys L, Guestrin C, Bigham J, Trister A, Patel K. App usage predicts cognitive ability in older adults. In: Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems Internet. USA: ACM; 2019 Presented at: CHI '19; May 2019; Glasgow, UK p. 1-12. [CrossRef]
  55. Chen R, Jankovic F, Marinsek N, Foschini L, Kourtis L, Signorini A, et al. Developing measures of cognitive impairment in the real world from consumer-grade multimodal sensor streams. In: Proceedings of the 25th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining. 2019 Jul Presented at: KDD '19; July 25; Anchorage p. 2145-2155. [CrossRef]
  56. Onnela J. Opportunities and challenges in the collection and analysis of digital phenotyping data. Neuropsychopharmacology 2021 Jan 17;46(1):45-54 [FREE Full text] [CrossRef] [Medline]
  57. Giovannetti T, Hackett K, Tassoni M, Mis R, Simone S. Assessing and predicting everyday function. In: Gregory B, Bruce C, Kathleen H, Tricia K, editors. APA Handbook of Neuropsychology. Washington, DC: American Psychological Association; Nov 2022.
  58. Germine L, Reinecke K, Chaytor NS. Digital neuropsychology: Challenges and opportunities at the intersection of science and software. Clin Neuropsychol 2019 Feb;33(2):271-286. [CrossRef] [Medline]
  59. Tucker-Drob EM. Neurocognitive functions and everyday functions change together in old age. Neuropsychology 2011 May;25(3):368-377 [FREE Full text] [CrossRef] [Medline]
  60. Hultsch D, Strauss E, Hunter M, MacDonald S. Intraindividual variability, cognition, and aging. In: The Handbook of Aging and Cognition. Hove, England: Psychology Press; 2011:491-556.
  61. Hultsch DF, MacDonald SWS, Dixon RA. Variability in reaction time performance of younger and older adults. J Gerontol B Psychol Sci Soc Sci 2002 Mar 01;57(2):P101-P115. [CrossRef] [Medline]
  62. Buckner RL. Memory and executive function in aging and AD: multiple factors that cause decline and reserve factors that compensate. Neuron 2004 Sep 30;44(1):195-208 [FREE Full text] [CrossRef] [Medline]
  63. Raz N, Lindenberger U, Rodrigue KM, Kennedy KM, Head D, Williamson A, et al. Regional brain changes in aging healthy adults: general trends, individual differences and modifiers. Cereb Cortex 2005 Nov;15(11):1676-1689 [FREE Full text] [CrossRef] [Medline]
  64. Bondi MW, Edmonds EC, Jak AJ, Clark LR, Delano-Wood L, McDonald CR, et al. Neuropsychological criteria for mild cognitive impairment improves diagnostic precision, biomarker associations, and progression rates. JAD 2014 Aug 11;42(1):275-289. [CrossRef]
  65. Jak AJ, Bondi MW, Delano-Wood L, Wierenga C, Corey-Bloom J, Salmon DP, et al. Quantification of five neuropsychological approaches to defining mild cognitive impairment. Am J Geriatr Psychiatry 2009 May;17(5):368-375 [FREE Full text] [CrossRef] [Medline]
  66. Edmonds EC, Delano-Wood L, Galasko DR, Salmon DP, Bondi MW. Subtle cognitive decline and biomarker staging in preclinical Alzheimer's disease. J Alzheimers Dis 2015;47(1):231-242 [FREE Full text] [CrossRef] [Medline]
  67. Lövdén M, Li S, Shing YL, Lindenberger U. Within-person trial-to-trial variability precedes and predicts cognitive decline in old and very old age: Longitudinal data from the Berlin Aging Study. Neuropsychologia 2007;45(12):2827-2838. [CrossRef]
  68. MacDonald SW, Nyberg L, Bäckman L. Intra-individual variability in behavior: links to brain structure, neurotransmission and neuronal activity. Trends Neurosci 2006 Aug;29(8):474-480. [CrossRef] [Medline]
  69. Troyer AK, Vandermorris S, Murphy KJ. Intraindividual variability in performance on associative memory tasks is elevated in amnestic mild cognitive impairment. Neuropsychologia 2016 Sep;90:110-116. [CrossRef] [Medline]
  70. MacDonald SWS, Hultsch DF, Dixon RA. Performance variability is related to change in cognition: evidence from the Victoria Longitudinal Study. Psychol Aging 2003 Sep;18(3):510-523. [CrossRef] [Medline]
  71. Christensen H, Mackinnon AJ, Korten A, Jorm A, Henderson A, Jacomb P. Dispersion in cognitive ability as a function of age: a longitudinal study of an elderly community sample. Neuropsychol Dev Cogn B 1999;6(3):214-228. [CrossRef]
  72. Holtzer R, Verghese J, Wang C, Hall CB, Lipton RB. Within-person across-neuropsychological test variability and incident dementia. JAMA 2008 Aug 20;300(7):823-830 [FREE Full text] [CrossRef] [Medline]
  73. Roalf DR, Quarmley M, Mechanic-Hamilton D, Wolk DA, Arnold SE, Moberg PJ. Within-individual variability: an index for subtle change in neurocognition in mild cognitive impairment. J Alzheimers Dis 2016 Aug 10;54(1):325-335 [FREE Full text] [CrossRef] [Medline]
  74. Bangen KJ, Weigand AJ, Thomas KR, Delano-Wood L, Clark LR, Eppig J, et al. Cognitive dispersion is a sensitive marker for early neurodegenerative changes and functional decline in nondemented older adults. Neuropsychology 2019 Jul;33(5):599-608 [FREE Full text] [CrossRef] [Medline]
  75. Farias ST, Mungas D, Reed BR, Harvey D, Cahn-Weiner D, Decarli C. MCI is associated with deficits in everyday functioning. Alzheimer Dis Assoc Disord 2006;20(4):217-223 [FREE Full text] [CrossRef] [Medline]
  76. Jekel K, Damian M, Wattmo C, Hausner L, Bullock R, Connelly PJ, et al. Mild cognitive impairment and deficits in instrumental activities of daily living: a systematic review. Alzheimers Res Ther 2015 Mar 18;7(1):17 [FREE Full text] [CrossRef] [Medline]
  77. Schmitter-Edgecombe M, McAlister C, Weakley A. Naturalistic assessment of everyday functioning in individuals with mild cognitive impairment: the day-out task. Neuropsychology 2012 Sep;26(5):631-641 [FREE Full text] [CrossRef] [Medline]
  78. Schmitter-Edgecombe M, Parsey CM. Cognitive correlates of functional abilities in individuals with mild cognitive impairment: comparison of questionnaire, direct observation, and performance-based measures. Clin Neuropsychol 2014;28(5):726-746 [FREE Full text] [CrossRef] [Medline]
  79. Triebel KL, Martin R, Griffith HR, Marceaux J, Okonkwo OC, Harrell L, et al. Declining financial capacity in mild cognitive impairment: A 1-year longitudinal study. Neurology 2009 Sep 22;73(12):928-934 [FREE Full text] [CrossRef] [Medline]
  80. Fauth EB, Schwartz S, Tschanz JT, Østbye T, Corcoran C, Norton MC. Baseline disability in activities of daily living predicts dementia risk even after controlling for baseline global cognitive ability and depressive symptoms. Int J Geriatr Psychiatry 2013 Jun 12;28(6):597-606 [FREE Full text] [CrossRef] [Medline]
  81. Luck T, Luppa M, Angermeyer MC, Villringer A, König H, Riedel-Heller SG. Impact of impairment in instrumental activities of daily living and mild cognitive impairment on time to incident dementia: results of the Leipzig Longitudinal Study of the Aged. Psychol Med 2011 May;41(5):1087-1097. [CrossRef] [Medline]
  82. Giovannetti T, Bettcher BM, Brennan L, Libon DJ, Burke M, Duey K, et al. Characterization of everyday functioning in mild cognitive impairment: a direct assessment approach. Dement Geriatr Cogn Disord 2008;25(4):359-365. [CrossRef] [Medline]
  83. Dodge HH, Mattek NC, Austin D, Hayes TL, Kaye JA. In-home walking speeds and variability trajectories associated with mild cognitive impairment. Neurology 2012 Jun 12;78(24):1946-1952 [FREE Full text] [CrossRef] [Medline]
  84. Akl A, Taati B, Mihailidis A. Autonomous unobtrusive detection of mild cognitive impairment in older adults. IEEE Trans Biomed Eng 2015 May;62(5):1383-1394. [CrossRef]
  85. Gillain S, Dramé M, Lekeu F, Wojtasik V, Ricour C, Croisier J, et al. Gait speed or gait variability, which one to use as a marker of risk to develop Alzheimer disease? A pilot study. Aging Clin Exp Res 2016 Apr;28(2):249-255. [CrossRef] [Medline]
  86. Hayes TL, Abendroth F, Adami A, Pavel M, Zitzelberger TA, Kaye JA. Unobtrusive assessment of activity patterns associated with mild cognitive impairment. Alzheimers Dement 2008 Nov;4(6):395-405 [FREE Full text] [CrossRef] [Medline]
  87. Montero-Odasso M, Casas A, Hansen KT, Bilski P, Gutmanis I, Wells JL, et al. Quantitative gait analysis under dual-task in older people with mild cognitive impairment: a reliability study. J Neuroeng Rehabil 2009 Sep 21;6(1):35 [FREE Full text] [CrossRef] [Medline]
  88. Verghese J, Robbins M, Holtzer R, Zimmerman M, Wang C, Xue X, et al. Gait dysfunction in mild cognitive impairment syndromes. J Am Geriatr Soc 2008 Jul;56(7):1244-1251 [FREE Full text] [CrossRef] [Medline]
  89. Verghese J, Wang C, Lipton RB, Holtzer R, Xue X. Quantitative gait dysfunction and risk of cognitive decline and dementia. J Neurol Neurosurg Psychiatry 2007 Sep;78(9):929-935 [FREE Full text] [CrossRef] [Medline]
  90. Benson RR, Guttmann CRG, Wei X, Warfield SK, Hall C, Schmidt JA, et al. Older people with impaired mobility have specific loci of periventricular abnormality on MRI. Neurology 2002 Jan 08;58(1):48-55. [CrossRef] [Medline]
  91. Onen F, Henry-Feugeas M, Roy C, Baron G, Ravaud P. Mobility decline of unknown origin in mild cognitive impairment: an MRI-based clinical study of the pathogenesis. Brain Res 2008 Jul 30;1222:79-86. [CrossRef] [Medline]
  92. Silbert LC, Nelson C, Howieson DB, Moore MM, Kaye JA. Impact of white matter hyperintensity volume progression on rate of cognitive and motor decline. Neurology 2008 Jul 08;71(2):108-113 [FREE Full text] [CrossRef] [Medline]
  93. Austin J, Klein K, Mattek N, Kaye J. Variability in medication taking is associated with cognitive performance in nondemented older adults. Alzheimers Dement (Amst) 2017;6:210-213 [FREE Full text] [CrossRef] [Medline]
  94. Seelye A, Hagler S, Mattek N, Howieson DB, Wild K, Dodge HH, et al. Computer mouse movement patterns: A potential marker of mild cognitive impairment. Alzheimers Dement (Amst) 2015 Dec 01;1(4):472-480 [FREE Full text] [CrossRef] [Medline]
  95. Reckess G, Varvaris M, Gordon B, Schretlen D. Within-person distributions of neuropsychological test scores as a function of dementia severity. Neuropsychology 2014 Mar;28(2):254-260. [CrossRef] [Medline]
  96. Hultsch DF, MacDonald SWS, Hunter MA, Levy-Bencheton J, Strauss E. Intraindividual variability in cognitive performance in older adults: Comparison of adults with mild dementia, adults with arthritis, and healthy adults. Neuropsychology 2000 Oct;14(4):588-598. [CrossRef]
  97. Giebel CM, Sutcliffe C, Challis D. Activities of daily living and quality of life across different stages of dementia: a UK study. Aging Ment Health 2015 Jan;19(1):63-71. [CrossRef] [Medline]
  98. Fitz A, Teri L. Depression, cognition, and functional ability in patients with Alzheimer's disease. J Am Geriatr Soc 1994 Feb;42(2):186-191. [CrossRef] [Medline]
  99. Moon H, Rote S, Haley WE. Factors that contribute to remaining in the community among older adults. Aging Ment Health 2018 Nov;22(11):1502-1509. [CrossRef] [Medline]
  100. Sajjad A, Freak-Poli RL, Hofman A, Roza SJ, Ikram MA, Tiemeier H. Subjective measures of health and all-cause mortality – the Rotterdam Study. Psychol Med 2017 Mar 13;47(11):1971-1980. [CrossRef]
  101. Lin C, Shih P, Ku LE. Activities of daily living function and neuropsychiatric symptoms of people with dementia and caregiver burden: The mediating role of caregiving hours. Arch Gerontol Geriatr 2019 Mar;81:25-30. [CrossRef] [Medline]
  102. Giovannetti T, Bettcher BM, Brennan L, Libon DJ, Kessler RK, Duey K. Coffee with jelly or unbuttered toast: commissions and omissions are dissociable aspects of everyday action impairment in Alzheimer's disease. Neuropsychology 2008 Mar;22(2):235-245. [CrossRef] [Medline]
  103. Kessler R, Giovannetti T, MacMullen L. Everyday action in schizophrenia: performance patterns and underlying cognitive mechanisms. Neuropsychology 2007 Jul;21(4):439-447. [CrossRef] [Medline]
  104. Rusted J, Sheppard L. Action-based memory in Alzheimer’s disease: a longitudinal look at tea making. Neurocase 2002;8(1):111-126. [CrossRef]
  105. Schmitter-Edgecombe M, Parsey CM. Assessment of functional change and cognitive correlates in the progression from healthy cognitive aging to dementia. Neuropsychology 2014 Nov;28(6):881-893 [FREE Full text] [CrossRef] [Medline]
  106. Suzuki T, Murase S, Tanaka T, Okazawa T. New approach for the early detection of dementia by recording in-house activities. Telemed J E Health 2007 Feb;13(1):41-44. [CrossRef] [Medline]
  107. Mattek N, Thomas NW, Sharma N, Beattie Z, Marcoe J, Riley T, et al. TD-P-17: Home-based digital activity biomarkers remotely monitor relevant activities of MCI and Alzheimer's disease patients and their care partners. Alzheimers Dement 2019 Jul;15(7S_Part_3):160. [CrossRef]
  108. Herlihy M, Wing J. Specifying graceful degradation. IEEE Trans Parallel Distrib Syst 1991 Jan;2(1):93-104. [CrossRef]
  109. Giovannetti T, Mis R, Hackett K, Simone S, Ungrady M. The goal-control model: An integrated neuropsychological framework to explain impaired performance of everyday activities. Neuropsychology 2021 Jan;35(1):3-18 [FREE Full text] [CrossRef] [Medline]
  110. Li S, Lindenberger U, Sikström S. Aging cognition: from neuromodulation to representation. Trends Cogn Sci 2001 Nov;5(11):479-486. [CrossRef]
  111. Li S, Brehmer Y, Shing YL, Werkle-Bergner M, Lindenberger U. Neuromodulation of associative and organizational plasticity across the life span: empirical evidence and neurocomputational modeling. Neurosci Biobehav Rev 2006 Jan;30(6):775-790. [CrossRef] [Medline]
  112. MacDonald SW, Cervenka S, Farde L, Nyberg L, Bäckman L. Extrastriatal dopamine D2 receptor binding modulates intraindividual variability in episodic recognition and executive functioning. Neuropsychologia 2009 Sep;47(11):2299-2304. [CrossRef] [Medline]
  113. Gleason CE, Norton D, Anderson ED, Wahoske M, Washington DT, Umucu E, et al. Cognitive variability predicts incident Alzheimer's disease and mild cognitive impairment comparable to a cerebrospinal fluid biomarker. J Alzheimers Dis 2018;61(1):79-89 [FREE Full text] [CrossRef] [Medline]
  114. Kelly AMC, Uddin LQ, Biswal BB, Castellanos FX, Milham MP. Competition between functional brain networks mediates behavioral variability. Neuroimage 2008 Jan 01;39(1):527-537. [CrossRef] [Medline]
  115. West R, Murphy KJ, Armilio ML, Craik FIM, Stuss DT. Lapses of intention and performance variability reveal age-related increases in fluctuations of executive control. Brain Cogn 2002 Aug;49(3):402-419. [CrossRef] [Medline]
  116. Schwartz MF, Montgomery MW, Buxbaum LJ, Lee SS, Carew TG, Coslett HB, et al. Naturalistic action impairment in closed head injury. Neuropsychology 1998;12(1):13-28. [CrossRef]
  117. Bassuk SS, Glass TA, Berkman LF. Social disengagement and incident cognitive decline in community-dwelling elderly persons. Ann Intern Med 1999 Aug 03;131(3):165-173. [CrossRef] [Medline]
  118. Dickinson WJ, Potter GG, Hybels CF, McQuoid DR, Steffens DC. Change in stress and social support as predictors of cognitive decline in older adults with and without depression. Int J Geriatr Psychiatry 2011 Dec 02;26(12):1267-1274 [FREE Full text] [CrossRef] [Medline]
  119. Kaye J, Mattek N, Dodge HH, Campbell I, Hayes T, Austin D, et al. Unobtrusive measurement of daily computer use to detect mild cognitive impairment. Alzheimers Dement 2014 Jan;10(1):10-17 [FREE Full text] [CrossRef] [Medline]
  120. Bernstein JPK, Dorociak K, Mattek N, Leese M, Trapp C, Beattie Z, et al. Unobtrusive, in-home assessment of older adults' everyday activities and health events: associations with cognitive performance over a brief observation period. Neuropsychol Dev Cogn B Aging Neuropsychol Cogn 2021 Apr 18:1-18. [CrossRef] [Medline]
  121. Chodosh J, Kado DM, Seeman TE, Karlamangla AS. Depressive symptoms as a predictor of cognitive decline: MacArthur studies of successful aging. Am J Geriatr Psychiatry 2007 May;15(5):406-415. [CrossRef]
  122. Gallaway P, Miyake H, Buchowski M, Shimada M, Yoshitake Y, Kim A, et al. Physical activity: a viable way to reduce the risks of mild cognitive impairment, Alzheimer's disease, and vascular dementia in older adults. Brain Sci 2017 Feb 20;7(2):22 [FREE Full text] [CrossRef] [Medline]
  123. Hilborn JV, Strauss E, Hultsch DF, Hunter MA. Intraindividual variability across cognitive domains: investigation of dispersion levels and performance profiles in older adults. J Clin Exp Neuropsychol 2009 May;31(4):412-424. [CrossRef] [Medline]
  124. Vaidyam A, Halamka J, Torous J. Actionable digital phenotyping: a framework for the delivery of just-in-time and longitudinal interventions in clinical healthcare. Mhealth 2019;5:25 [FREE Full text] [CrossRef] [Medline]
  125. Mobile fact sheet. Pew Research Center. 2021.   URL: https://www.pewresearch.org/internet/fact-sheet/mobile/ [accessed 2022-04-01]
  126. Barnett I, Torous J, Staples P, Keshavan M, Onnela J. Beyond smartphones and sensors: choosing appropriate statistical methods for the analysis of longitudinal data. J Am Med Inform Assoc 2018 Dec 01;25(12):1669-1674. [CrossRef] [Medline]
  127. Barnett I, Torous J, Reeder H, Baker J, Onnela JP. Determining sample size and length of follow-up for smartphone-based digital phenotyping studies. J Am Med Inform Assoc 2020 Dec 09;27(12):1844-1849 [FREE Full text] [CrossRef] [Medline]
  128. Kourtis LC, Regele OB, Wright JM, Jones GB. Digital biomarkers for Alzheimer's disease: the mobile/ wearable devices opportunity. NPJ Digit Med 2019;2 [FREE Full text] [CrossRef] [Medline]
  129. Taylor KI, Staunton H, Lipsmeier F, Nobbs D, Lindemann M. Outcome measures based on digital health technology sensor data: data- and patient-centric approaches. NPJ Digit Med 2020 Jul 23;3(1):97 [FREE Full text] [CrossRef] [Medline]
  130. Teipel S, König A, Hoey J, Kaye J, Krüger F, Robillard JM, et al. Use of nonintrusive sensor-based information and communication technology for real-world evidence for clinical trials in dementia. Alzheimers Dement 2018 Sep;14(9):1216-1231 [FREE Full text] [CrossRef] [Medline]
  131. Real-world data and real-world evidence supporting clinical decision making. Professional Society for Health Economics and Outcomes Research.   URL: https://www.ispor.org/images/default-source/strategic-initiative/pfizer-bms-ispor-infographic_finalf1e4521f586d4354b69ff9700bd2fb93.jpg?sfvrsn=508911d8_2 [accessed 2022-04-01]
  132. Bartlett Ellis R, Wright J, Miller LS, Jake-Schoffman D, Hekler EB, Goldstein CM, et al. Lessons learned: beta-testing the Digital Health Checklist for Researchers prompts a call to action by behavioral scientists. J Med Internet Res 2021 Dec 22;23(12):e25414 [FREE Full text] [CrossRef] [Medline]
  133. Policy for device software functions and mobile medical applications. Guidance for industry and Food and Drug Administration staff. US Food and Drug Administration. 2019 Sep.   URL: https://www.fda.gov/media/80958/download [accessed 2022-04-01]
  134. Nebeker C, Bartlett Ellis RJ, Torous J. Development of a decision-making checklist tool to support technology selection in digital health research. Transl Behav Med 2020 Oct 08;10(4):1004-1015 [FREE Full text] [CrossRef] [Medline]


Abbreviations

AD: Alzheimer’s disease
ADRD: Alzheimer’s disease and related dementias
CSF: cerebrospinal fluid
MCI: mild cognitive impairment
NIA: National Institute on Aging
NIH: National Institutes of Health
VIBE: Variability in Everyday Behavior


Edited by T Leung; submitted 21.03.22; peer-reviewed by E Weizenbaum, G Jones; comments to author 29.06.22; revised version received 19.07.22; accepted 31.07.22; published 07.09.22

Copyright

©Katherine Hackett, Tania Giovannetti. Originally published in JMIR Aging (https://aging.jmir.org), 07.09.2022.

This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Aging, is properly cited. The complete bibliographic information, a link to the original publication on https://aging.jmir.org, as well as this copyright and license information must be included.