Optimizing the unsupervised assessment of cognition in web‐based registries of individuals at risk for Alzheimer's disease: Results from the Healthy Brain Project: Neuropsychology/computerized neuropsychological assessment. (7th December 2020)
- Record Type:
- Journal Article
- Title:
- Optimizing the unsupervised assessment of cognition in web‐based registries of individuals at risk for Alzheimer's disease: Results from the Healthy Brain Project: Neuropsychology/computerized neuropsychological assessment. (7th December 2020)
- Main Title:
- Optimizing the unsupervised assessment of cognition in web‐based registries of individuals at risk for Alzheimer's disease: Results from the Healthy Brain Project
- Authors:
- Lim, Yen Ying
Perin, Stephanie
Buckley, Rachel F.
Pase, Matthew P.
Yassi, Nawaf
Lavale, Alexandra
Schembri, Adrian
Maruff, Paul T.
- Abstract:
- Background: Web‐based platforms are used increasingly to assess cognitive function in unsupervised settings. However, methods for ensuring the validity of cognitive data arising from unsupervised assessments are limited. We applied Human Computer Interaction (HCI) concepts of acceptability and usability to examine the validity of unsupervised cognitive testing in middle‐aged adults enrolled in the Healthy Brain Project (HBP).
Method: A total of 1594 middle‐aged adults (mean age 56 years, SD 6.60 years) completed unsupervised assessments using a self‐administered version of the Cogstate Brief Battery (CBB) via our online platform, healthybrainproject.org.au. HCI acceptability was defined by the nature and amount of missing data, and HCI usability as (a) errors made during test performance, and (b) the time taken to read test instructions and complete the tests (learnability). We also explored whether cognitive performance varied across different testing environments (e.g., home/work alone, home/work with others around, public space).
Result: Overall, we observed high acceptability (98% complete data) and high usability (95% met criteria for low error rates and high learnability). Test validity was confirmed by observation of the expected inverse relationships between performance and increasing test difficulty and age. When accounting for the effects of age, no general influence of testing environment was found for any of the cognitive outcome measures in this study. Testing environment also had no significant effect on the proportion of individuals who satisfied our pre‐specified acceptability and usability criteria.
Conclusion: With HCI acceptability and usability criteria considered, the data collected in this study retain similar psychometric characteristics to those collected from supervised testing of the same tests. The HCI definitions of acceptability and usability show great promise for use as real-time algorithms for the collection of valid indices of cognition in unsupervised settings. This is of particular salience when considering the development of registries of individuals at risk for Alzheimer's disease.
- Is Part Of:
- Alzheimer's & dementia. Volume 16(2020)Supplement 6
- Journal:
- Alzheimer's & dementia
- Issue:
- Volume 16(2020)Supplement 6
- Issue Display:
- Volume 16, Issue 6 (2020)
- Year:
- 2020
- Volume:
- 16
- Issue:
- 6
- Issue Sort Value:
- 2020-0016-0006-0000
- Page Start:
- n/a
- Page End:
- n/a
- Publication Date:
- 2020-12-07
- Subjects:
- Alzheimer's disease -- Periodicals
Alzheimer Disease -- Periodicals
Dementia -- Periodicals
Dementia (French heading: Démence)
Alzheimer's disease (French heading: Maladie d'Alzheimer)
Electronic periodical (form descriptor; French heading: Périodique électronique)
Internet resource (form descriptor; French heading: Ressource Internet)
616.83
- Journal URLs:
- http://www.sciencedirect.com/science/journal/15525260
http://www.elsevier.com/journals
- DOI:
- 10.1002/alz.044726
- Languages:
- English
- ISSNs:
- 1552-5260
- Deposit Type:
- Legal deposit
- View Content:
- Available online (eLD content is only available in our Reading Rooms)
- Physical Locations:
- British Library DSC - 0806.255333
British Library DSC - BLDSS-3PM
British Library HMNTS - ELD Digital store
- Ingest File:
- 21898.xml