Machine Learning based calibration time reduction for Gas Sensors in Temperature Cycled Operation

2021 
This paper shows how data preprocessing influences the time required to record a sufficient amount of valid calibration data. Specifically, we approach the minimum calibration time from two sides: on the one hand, repetitions are omitted from the training data one by one to determine the lowest number of valid samples a model needs to achieve reasonable accuracy. On the other hand, we add samples that are labeled as valid by automatic steady-state detection to the dataset, in contrast to time-consuming manual annotation. The results are demonstrated on a dataset of a metal oxide semiconductor gas sensor in temperature cycled operation measuring mixtures of artificial room air containing several volatile organic compounds, quantifying formaldehyde, which is carcinogenic and therefore of high concern in indoor environments. The dataset is generated with an automated gas mixing system and then optimized with data preprocessing methods based on steady-state detection, outlier detection, and ResNet neural networks. The dataset can be reduced to only 50 % of the original data while still training an artificial neural network to a root mean square error smaller than 25 % of the guideline value for formaldehyde concentration defined by the WHO.
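The abstract does not specify how the steady-state detection that auto-labels valid samples is implemented. A common minimal approach is to mark a sample as steady-state when the rolling standard deviation of the sensor signal falls below a threshold; the sketch below illustrates this idea, with the function name, window size, and threshold chosen for illustration only (they are not values from the paper).

```python
import numpy as np

def steady_state_mask(signal, window=50, threshold=0.01):
    """Label samples as steady-state when the rolling standard
    deviation of the signal falls below `threshold`.

    `window` and `threshold` are illustrative parameters, not
    values from the paper.
    """
    signal = np.asarray(signal, dtype=float)
    mask = np.zeros(signal.shape, dtype=bool)
    # Slide a window over the signal; the last sample of each
    # sufficiently "quiet" window is labeled as valid.
    for i in range(window, len(signal) + 1):
        if np.std(signal[i - window:i]) < threshold:
            mask[i - 1] = True
    return mask

# Example: a sensor response decaying toward a steady state.
t = np.arange(200)
sig = np.exp(-t / 20.0)
valid = steady_state_mask(sig, window=20, threshold=1e-3)
```

Samples flagged by such a mask could then be added to the calibration set in place of manually annotated ones, which is the trade-off the paper evaluates.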