Nonparametric kernel smoother on topology learning neural networks for incremental and ensemble regression

2017 
Incremental learning is an effective technique for improving the space efficiency of machine learning algorithms, and ensemble learning can combine different algorithms into more accurate ones. Parameter selection for incremental methods is difficult because no retraining is allowed, and the combination of incremental and ensemble learning has not been fully explored. In this paper, we propose a parameter-free regression framework that combines incremental learning with ensemble learning. First, topology learning neural networks such as the growing neural gas (GNG) and the self-organizing incremental neural network (SOINN) are employed to handle nonlinearity. Then, the vector quantizations produced by GNG and SOINN are transformed into a feed-forward neural network by an improved Nadaraya–Watson estimator, and a maximum likelihood process is devised for adaptive selection of the estimator's parameters. Finally, a weighted training strategy is incorporated so that the topology learning regressors can be combined by AdaBoost for ensemble learning. Experiments are carried out on five UCI datasets, together with an application study of short-term traffic flow prediction. The results show that the proposed method is comparable to mainstream incremental and non-incremental regression methods, and performs better in short-term traffic flow prediction.
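To make the core idea concrete, the sketch below shows a Nadaraya–Watson kernel regression over prototype nodes of the kind a GNG/SOINN vector quantization produces. This is not the authors' code: the Gaussian kernel, the single bandwidth `h`, and the leave-one-out likelihood used to pick `h` are illustrative assumptions standing in for the paper's improved estimator and its maximum likelihood procedure.

```python
import numpy as np

def nw_predict(x, centers, targets, h):
    """Nadaraya-Watson prediction at query x.

    centers : (n, d) prototype positions (e.g. GNG/SOINN node weights)
    targets : (n,)   target value stored at each prototype
    h       : kernel bandwidth (assumed Gaussian kernel)
    """
    d2 = np.sum((centers - x) ** 2, axis=1)      # squared distances to prototypes
    w = np.exp(-d2 / (2.0 * h ** 2))             # Gaussian kernel weights
    return np.dot(w, targets) / (np.sum(w) + 1e-12)  # weighted average of targets

def select_bandwidth(centers, targets, candidates):
    """Pick h by maximizing a leave-one-out Gaussian log-likelihood
    (a simple stand-in for the paper's maximum likelihood process)."""
    n = len(centers)
    best_h, best_ll = None, -np.inf
    for h in candidates:
        ll = 0.0
        for i in range(n):
            mask = np.arange(n) != i
            pred = nw_predict(centers[i], centers[mask], targets[mask], h)
            ll += -0.5 * (targets[i] - pred) ** 2  # log-likelihood up to constants
        if ll > best_ll:
            best_h, best_ll = h, ll
    return best_h

# Toy usage: prototypes sampled from a sine curve
rng = np.random.default_rng(0)
centers = rng.uniform(0.0, 2.0 * np.pi, size=(30, 1))
targets = np.sin(centers[:, 0])
h = select_bandwidth(centers, targets, candidates=[0.1, 0.3, 0.5, 1.0])
print(nw_predict(np.array([1.0]), centers, targets, h))
```

Because the prediction is a weight-normalized sum over a fixed set of prototypes, it can be read as a small feed-forward network with one kernel unit per node, which is the transformation the abstract refers to; the boosted ensemble then reweights the training data for each such regressor in the usual AdaBoost fashion.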