STAR-Lite: A light-weight scalable self-taught learning framework for older adults’ activity recognition

2022 
Recognizing the activities of older adults is challenging because aging (e.g., limited dexterity, reduced limb control, slower response times) and/or underlying health conditions (e.g., dementia) produce a wide variety of activity patterns. Existing deep learning methods recognize activities successfully when the dataset contains high-quality annotations and is captured in a controlled environment. In contrast, data captured in real-world environments, especially from older adults who exhibit memory-related symptoms and varying psychological and mental-health status, rely on caregivers to perform daily activities, and lack domain-specific annotators, makes obtaining quality annotated data difficult, leaving us with limited labeled data and abundant unlabeled data. In this paper, we hypothesize that projecting the labeled data representations, which cover a specific set of activities, onto a new representation space characterized by unlabeled data comprising activities beyond those in the labeled dataset would reduce reliance on annotations and improve activity-detection performance. Motivated by this, we propose STAR-Lite, a self-taught learning framework whose pre-training stage prepares the new representation space to account for activities beyond the initial labels in the labeled dataset. STAR-Lite projects the labeled data representations onto the new representation space characterized by the unlabeled data and learns higher-level representations of the labeled dataset while optimizing inter- and intra-class distances, without explicitly using a computation-hungry similarity-based approach. We demonstrate that our proposed approach (a) improves activity recognition performance in a supervised setting and (b) is feasible for real-world deployment.
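The core idea of projecting scarce labeled data onto a representation space learned from abundant unlabeled data can be illustrated with a minimal numpy sketch. This is not the paper's actual method: STAR-Lite learns its representation space with a pre-training network, whereas here a PCA basis (via SVD) stands in for that learned space, and all array shapes and names are illustrative.

```python
import numpy as np

def learn_basis(unlabeled, k):
    """Learn a k-dimensional basis from unlabeled data (PCA via SVD).

    A stand-in for the paper's pre-trained representation space: the
    unlabeled pool, covering more activities than the labeled set,
    characterizes the space onto which labeled features are projected.
    """
    centered = unlabeled - unlabeled.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return vt[:k]  # (k, d) orthonormal basis rows

def project(labeled, basis):
    """Map labeled features into the unlabeled-data representation space."""
    return labeled @ basis.T  # (n, k) higher-level representations

rng = np.random.default_rng(0)
unlabeled = rng.normal(size=(500, 16))  # abundant unlabeled sensor features
labeled = rng.normal(size=(40, 16))     # scarce annotated features
basis = learn_basis(unlabeled, k=8)
z = project(labeled, basis)             # representations used downstream
```

A downstream classifier would then be trained on `z` rather than on the raw labeled features, which is the sense in which the approach "relies less" on annotation volume.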
To make STAR-Lite deployable on devices with limited memory, we explore model-compression techniques such as pruning and quantization and propose a novel layer-wise pruning-rate optimization technique that effectively compresses the network while preserving model performance. The evaluation was performed on the Alzheimer's Activity Recognition (AAR) dataset, captured with IRB approval (#Y18NR12035) from 25 individuals living in a retirement community using an in-house infrastructure, while the participants were concurrently assessed clinically for dementia and independent living. Our extensive evaluation reveals that STAR-Lite detects activities with an F1-score of 85.12% despite a 62% reduction in model size and with a 5% improvement in execution time on a resource-constrained device.
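The compression pipeline described above combines per-layer pruning with quantization. The following is a minimal numpy sketch of that combination, assuming plain magnitude pruning and post-training uniform affine quantization; the paper's contribution is the optimization of the per-layer rates, which is emulated here only by passing a different (hypothetical) rate for each layer.

```python
import numpy as np

def prune_layer(weights, rate):
    """Zero the smallest-magnitude `rate` fraction of a layer's weights.

    Plain magnitude pruning; STAR-Lite's layer-wise optimization would
    choose `rate` per layer, here we simply hand-pick the rates.
    """
    k = int(round(rate * weights.size))
    if k == 0:
        return weights.copy()
    thresh = np.partition(np.abs(weights).ravel(), k - 1)[k - 1]
    pruned = weights.copy()
    pruned[np.abs(pruned) <= thresh] = 0.0
    return pruned

def quantize(weights, bits=8):
    """Post-training uniform affine quantization to unsigned `bits`-bit ints."""
    lo, hi = float(weights.min()), float(weights.max())
    scale = (hi - lo) / (2 ** bits - 1)
    q = np.round((weights - lo) / scale).astype(np.uint8)
    return q, scale, lo

def dequantize(q, scale, lo):
    """Recover approximate float weights from the quantized tensor."""
    return q.astype(np.float32) * scale + lo

rng = np.random.default_rng(42)
# Hypothetical two-layer network with illustrative per-layer pruning rates.
layers = {"fc1": rng.normal(size=(64, 32)), "fc2": rng.normal(size=(32, 8))}
rates = {"fc1": 0.7, "fc2": 0.5}
compressed = {name: quantize(prune_layer(w, rates[name]))
              for name, w in layers.items()}
```

Storing the uint8 tensor plus one `(scale, lo)` pair per layer is what yields the model-size reduction; the sparsity from pruning can additionally be exploited by a sparse storage format or sparse kernels at inference time.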