Stress Prediction Using Machine Learning and IoT

2022 
Stress is a mental condition that affects every aspect of life, leading to sleep deprivation and various other diseases; it is therefore necessary to analyse one's vitals to stay informed about one's mental health. This paper presents an effective method for detecting cognitive stress levels using data from a physical-activity tracker device. The main goal of this system is to use sensor technology to detect stress with a machine learning approach. The impact of each stressor is first assessed individually using ML models, followed by the construction of a neural network (NN) model and assessment with ordinal logistic regression models such as logit, probit and complementary log-log. The paper uses heartbeat rate as one of the features to recognise stress, and uses the Internet of Things (IoT) and Machine Learning (ML) to raise an alert when the person is in real danger: the patient's condition is predicted using machine learning, and an acute stress condition is relayed using IoT. Based on the heartbeat, a prediction of whether a person is under stress can be made.

The paper presents a model that predicts stress levels entirely from electrocardiogram (ECG) data, which can be measured with consumer-grade heart monitors. The model includes the ECG's spectral power components as well as time- and frequency-domain features of heart rate variability. The stress detector system takes real-time data from the IoT device (sensor), applies a machine learning model to the data to detect the individual's stress level, and finally informs/alerts the individual about their stress condition. The system is tested and evaluated on collected data in a real-time environment with different machine learning models. Finally, a comprehensive comparative analysis of the applied models is presented, with the Random Forest Classifier showing the highest accuracy.
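A minimal sketch of the classification step described above, using a Random Forest (the model the abstract reports as most accurate) on heart-rate-variability-style features. The feature names (mean heart rate, SDNN, RMSSD, LF/HF ratio), the synthetic data, and the toy labelling rule are illustrative assumptions, not the paper's dataset or method details:

```python
# Sketch: Random Forest stress classifier on assumed HRV-style features.
# Synthetic data stands in for the paper's tracker/ECG recordings.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 400
mean_hr = rng.normal(75, 12, n)   # mean heart rate (bpm) - assumed feature
sdnn = rng.normal(50, 15, n)      # SDNN (ms), a time-domain HRV feature
rmssd = rng.normal(40, 12, n)     # RMSSD (ms), another time-domain feature
lf_hf = rng.normal(1.5, 0.6, n)   # LF/HF ratio, a frequency-domain feature
X = np.column_stack([mean_hr, sdnn, rmssd, lf_hf])

# Toy label rule (assumption): high heart rate with low variability -> stressed.
y = ((mean_hr > 80) & (sdnn < 50)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", round(clf.score(X_te, y_te), 2))
```

The same feature matrix could be fed to the other models the paper compares (e.g. an NN or ordinal logistic regression) to reproduce the comparative analysis.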
The novelty of this work lies in the fact that a stress detection framework should be as unobtrusive to the user as possible.
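The alert step that follows prediction can be sketched as a small decision function: given the model's stress prediction and the live heartbeat reading from the IoT sensor, decide whether to relay an alert. The threshold value and message strings here are illustrative assumptions, not values from the paper:

```python
# Sketch of the IoT alert step: map a model prediction plus a live
# heart-rate reading to a notification. Threshold is an assumption.
def stress_alert(predicted_stressed: bool, heart_rate_bpm: float,
                 danger_bpm: float = 120.0) -> str:
    """Return an alert message for the wearer based on model output."""
    if predicted_stressed and heart_rate_bpm >= danger_bpm:
        return "ALERT: acute stress detected, relay to caregiver"
    if predicted_stressed:
        return "WARNING: elevated stress level detected"
    return "OK: no stress detected"

print(stress_alert(True, 130.0))
print(stress_alert(False, 72.0))
```

In a deployed system this function would sit between the classifier and the IoT messaging layer, so the device stays unobtrusive until a prediction actually warrants interrupting the user.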