The CNN Hip Accelerometer Posture (CHAP) Method for Classifying Sitting Patterns from Hip Accelerometers: A Validation Study.

2021 
Introduction: Sitting patterns predict several healthy aging outcomes. These patterns can potentially be measured using hip-worn accelerometers, but current methods are limited by an inability to detect postural transitions. To overcome these limitations, we developed the Convolutional Neural Network Hip Accelerometer Posture (CHAP) classification method.

Methods: CHAP was developed on 709 older adults who wore an ActiGraph GT3X+ accelerometer on the hip, with ground-truth sit/stand labels derived from concurrently worn activPAL thigh inclinometers for up to 7 days. CHAP was compared with traditional cut-point methods of sitting pattern classification as well as a previous machine-learned algorithm (Two Level Behavior Classification [TLBC]).

Results: For minute-level sitting vs. non-sitting classification, CHAP performed better (93% agreement with activPAL) than other methods (74%-83% agreement). CHAP also outperformed other methods in sensitivity for detecting sit-to-stand transitions: cut-point (73%), TLBC (26%), and CHAP (83%). CHAP's positive predictive value for capturing sit-to-stand transitions was likewise superior: cut-point (30%), TLBC (71%), and CHAP (83%). Day-level sitting pattern metrics, such as mean sitting bout duration, derived from CHAP did not differ significantly from activPAL, whereas those from other methods did: activPAL (15.4 min mean sitting bout duration), CHAP (15.7 min), cut-point (9.4 min), TLBC (49.4 min).

Conclusion: CHAP was the most accurate method for classifying sit-to-stand transitions and sitting patterns from free-living hip-worn accelerometer data in older adults. It enables enhanced analysis of older adult movement data, yielding more accurate measures of sitting patterns and opening the door to large-scale cohort studies of the effects of sitting patterns on healthy aging outcomes.
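The evaluation metrics reported above (minute-level agreement, plus sensitivity and positive predictive value for sit-to-stand transitions) can be illustrated with a minimal sketch. This is not the CHAP implementation; all function names are hypothetical, labels are simplified to 1 = sitting and 0 = not sitting per minute, and an assumed small temporal tolerance is used when matching transition events between the predicted and activPAL label streams.

```python
def minute_agreement(pred, truth):
    """Fraction of minutes where the predicted posture matches ground truth."""
    return sum(p == t for p, t in zip(pred, truth)) / len(truth)

def sit_to_stand_events(labels):
    """Indices where a sit-to-stand transition occurs (label goes 1 -> 0)."""
    return {i for i in range(1, len(labels))
            if labels[i - 1] == 1 and labels[i] == 0}

def transition_metrics(pred, truth, tolerance=1):
    """Sensitivity and positive predictive value for sit-to-stand
    transitions, allowing `tolerance` minutes of slack when matching
    a predicted event to a ground-truth event (assumed convention)."""
    pred_t = sit_to_stand_events(pred)
    true_t = sit_to_stand_events(truth)

    def matched(events, others):
        # Count events that have a counterpart within the tolerance window.
        return sum(any(abs(e - o) <= tolerance for o in others) for e in events)

    sensitivity = matched(true_t, pred_t) / len(true_t) if true_t else float("nan")
    ppv = matched(pred_t, true_t) / len(pred_t) if pred_t else float("nan")
    return sensitivity, ppv

# Toy example: 8 minutes of aligned per-minute labels (1 = sitting).
truth = [1, 1, 1, 0, 0, 1, 1, 0]
pred  = [1, 1, 0, 0, 0, 1, 1, 0]
print(minute_agreement(pred, truth))   # -> 0.875 (7 of 8 minutes agree)
print(transition_metrics(pred, truth)) # -> (1.0, 1.0) with 1-min tolerance
```

In the toy data, the predicted transition at minute 2 falls within one minute of the true transition at minute 3, so it counts as a match; day-level bout metrics such as mean sitting bout duration would similarly be derived from runs of consecutive sitting minutes.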