Multilayered neural architectures evolution for computing sequences of orthogonal polynomials

2018 
This article presents an evolutionary algorithm that autonomously constructs fully connected multilayered feedforward neural architectures. The algorithm employs grammar-guided genetic programming with a context-free grammar specifically designed to satisfy three important restrictions. First, the sentences of the language generated by the grammar encode all valid neural architectures and only valid ones. Second, fully connected feedforward neural architectures of any size can be generated. Third, smaller neural architectures are favored to avoid overfitting. The proposed evolutionary neural architecture construction system is applied to compute the terms of the two coefficient sequences that define the three-term recurrence relation associated with a sequence of orthogonal polynomials. This application imposes an important constraint: the training datasets are always very small. An adequately sized neural architecture must therefore be evolved to achieve satisfactory results, which are reported in terms of accuracy, size of the evolved neural architectures, and convergence speed of the evolutionary process.
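
For context, the three-term recurrence relation referred to in the abstract has the following standard monic form (index conventions vary across references; the names a_n and b_n below are one common choice for the two coefficient sequences, not necessarily the notation used in the paper):

    \[
      p_{-1}(x) = 0, \qquad p_0(x) = 1,
    \]
    \[
      p_{n+1}(x) = \bigl(x - a_{n+1}\bigr)\, p_n(x) - b_{n+1}\, p_{n-1}(x), \qquad n \ge 0.
    \]

Under this convention, the two sequences (a_n) and (b_n) are what the evolved neural networks are trained to compute, from the small training datasets the application provides.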
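To illustrate the mechanism behind the three grammar restrictions, here is a minimal, hypothetical sketch of a context-free grammar whose sentences encode only valid fully connected feedforward architectures (a list of hidden-layer widths), can reach architectures of any depth through a recursive production, and favor smaller networks by weighting the non-recursive production more heavily. This is not the grammar from the paper; the symbol names, layer-width set, and probabilities are illustrative assumptions only.

    import random

    # Hypothetical grammar: ARCH derives a nonempty list of hidden-layer
    # widths, so every sentence is a valid fully connected feedforward
    # architecture. The recursive ARCH production allows any depth, and
    # its lower weight biases sampling toward smaller networks.
    GRAMMAR = {
        # ARCH -> LAYER (prob. 0.7) | LAYER ARCH (prob. 0.3)
        "ARCH": [(["LAYER"], 0.7), (["LAYER", "ARCH"], 0.3)],
        # LAYER -> a hidden-layer width from a small terminal set
        "LAYER": [([str(n)], 1.0 / 8) for n in (1, 2, 3, 4, 5, 6, 7, 8)],
    }

    def derive(symbol="ARCH", rng=random):
        """Expand a nonterminal into a sentence: a list of layer widths."""
        if symbol not in GRAMMAR:          # terminal: a layer width
            return [int(symbol)]
        rules, weights = zip(*GRAMMAR[symbol])
        chosen = rng.choices(rules, weights=weights, k=1)[0]
        sentence = []
        for s in chosen:
            sentence.extend(derive(s, rng))
        return sentence

    # Example: [4, 2] encodes two fully connected hidden layers of
    # widths 4 and 2 between the fixed input and output layers.
    print(derive())

Because the recursive production fires with probability 0.3 at each step, derivation lengths follow a geometric distribution, so sampled sentences (and the crossover offspring built from them in grammar-guided genetic programming) tend toward compact architectures, which is the anti-overfitting bias the abstract describes.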