Applying a fusion of wearable sensors and a cognitive inspired architecture to real-time ergonomics analysis of manual assembly tasks

2021 
Abstract

High-value manufacturing systems still require ergonomically intensive manual activities. Examples include the aerospace industry, where the fitting of pipes and wiring into confined spaces in aircraft wings is still a manual operation. In these environments, workers are subjected to ergonomically awkward forces and postures for long periods of time, leading to musculoskeletal injuries that severely limit shopfloor output and productivity. Tools such as wearable sensors could provide a way to track the ergonomics of workers in real time. However, an information-processing architecture is required to ensure that data is processed in real time and in a manner that yields meaningful action points for workers. In this work, based on the Adaptive Control of Thought—Rational (ACT-R) cognitive framework, we propose a Cognitive Architecture for Wearable Sensors (CAWES): a wearable sensor system and cognitive architecture capable of taking data streams from multiple wearable sensors on a worker's body and fusing them to enable digitisation, tracking and analysis of human ergonomics in real time on a shopfloor. Furthermore, through tactile feedback, the architecture informs workers in real time when ergonomics rules are broken. The architecture is validated through an aerospace case study undertaken in laboratory conditions. The results of the validation are encouraging, and further tests will be performed in an actual working environment in the future.
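The abstract describes a pipeline of fusing multiple body-worn sensor streams, checking ergonomics rules against the fused estimate, and issuing tactile feedback when a rule is broken. The sketch below illustrates that loop under stated assumptions: it is not the paper's CAWES or ACT-R implementation. The sensor placements (chest, pelvis), the accelerometer-only tilt estimate, the trunk-flexion rule, the 60-degree threshold, and the trigger_vibration_motor stand-in are all illustrative assumptions.

```python
# Minimal sketch of a wearable-sensor ergonomics loop: fuse two body-worn
# IMU readings into a trunk-flexion estimate, apply an ergonomics rule,
# and fire a tactile alert when the rule is broken. Names and thresholds
# are illustrative assumptions, not the paper's implementation.
from dataclasses import dataclass
import math


@dataclass
class ImuSample:
    """One reading from a body-worn IMU (accelerometer only, for brevity)."""
    sensor_id: str
    ax: float  # acceleration along sensor x-axis, m/s^2
    ay: float
    az: float


def tilt_angle_deg(sample: ImuSample) -> float:
    """Estimate the sensor's tilt from vertical using the gravity vector."""
    g = math.sqrt(sample.ax ** 2 + sample.ay ** 2 + sample.az ** 2)
    if g == 0:
        return 0.0
    # Clamp to guard acos against floating-point drift outside [-1, 1].
    return math.degrees(math.acos(max(-1.0, min(1.0, sample.az / g))))


def fuse_trunk_flexion(chest: ImuSample, pelvis: ImuSample) -> float:
    """Fuse two torso-mounted sensors into one trunk-flexion estimate
    (relative tilt of the chest with respect to the pelvis)."""
    return tilt_angle_deg(chest) - tilt_angle_deg(pelvis)


# Assumed high-risk posture threshold, in the spirit of REBA/RULA scoring.
TRUNK_FLEXION_LIMIT_DEG = 60.0


def trigger_vibration_motor() -> None:
    """Stand-in for driving a real haptic actuator on the worker's body."""
    print("haptic alert: trunk flexion rule broken")


def check_and_alert(chest: ImuSample, pelvis: ImuSample) -> bool:
    """Apply the ergonomics rule; trigger tactile feedback if it is broken."""
    flexion = fuse_trunk_flexion(chest, pelvis)
    if flexion > TRUNK_FLEXION_LIMIT_DEG:
        trigger_vibration_motor()
        return True
    return False


if __name__ == "__main__":
    chest = ImuSample("chest", ax=9.0, ay=0.0, az=3.9)    # leaning well forward
    pelvis = ImuSample("pelvis", ax=0.5, ay=0.0, az=9.8)  # near upright
    check_and_alert(chest, pelvis)  # prints the haptic alert
```

In a real-time deployment this check would run per sample inside the architecture's processing loop; the paper's cognitive layer (ACT-R-style rule matching over fused percepts) is abstracted here into the single threshold test.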