On dimensionality reduction via partial least squares for Kriging-based reliability analysis with active learning

2021 
Abstract

Kriging with active learning has been widely employed to calculate the failure probability of a problem with random inputs. Training a Kriging model for a high-dimensional problem is computationally expensive, which reduces the efficiency of active learning strategies because the training cost becomes comparable to that of the function evaluation itself. Kriging with partial least squares (KPLS) offers fast training, but its efficacy for high-dimensional reliability analysis has not yet been properly investigated. In this paper, we assess the potential benefits of KPLS for solving high-dimensional reliability analysis problems, aiming to identify its advantages and characterize the problem domain where it is most efficient and accurate in estimating the failure probability. Tests on a set of benchmark problems of various dimensionalities reveal that KPLS with four principal components significantly reduces the CPU time compared to ordinary Kriging while still estimating the failure probability accurately. On some problems, KPLS with four principal components also reduces the number of function evaluations, which is beneficial when each function evaluation is expensive. Using too few principal components, however, yields no evident improvement over ordinary Kriging.
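To make the setting concrete, the sketch below shows a minimal AK-MCS-style active-learning loop around a KPLS surrogate. It assumes the open-source SMT toolbox (smt.surrogate_models.KPLS), standard-normal inputs, the classic U learning function with the usual min U >= 2 stopping rule, and a hypothetical user-supplied limit_state function; the paper's actual learning function, stopping criterion, and benchmarks may differ.

```python
# Minimal AK-MCS-style sketch, assuming the SMT toolbox (pip install smt).
# KPLS(n_comp=4) mirrors the four-component setting discussed in the abstract.
import numpy as np
from smt.surrogate_models import KPLS

def estimate_pf(limit_state, dim, n_mc=10**5, n_init=12, n_max=100, seed=0):
    """Estimate the failure probability P[g(X) <= 0] with an actively trained KPLS surrogate."""
    rng = np.random.default_rng(seed)
    x_mc = rng.standard_normal((n_mc, dim))        # Monte Carlo population (standard-normal inputs)
    idx = rng.choice(n_mc, n_init, replace=False)  # initial design of experiments
    x_doe = x_mc[idx]
    y_doe = np.array([limit_state(x) for x in x_doe])

    for _ in range(n_max):
        sm = KPLS(n_comp=4, print_global=False)    # Kriging on 4 PLS components
        sm.set_training_values(x_doe, y_doe.reshape(-1, 1))
        sm.train()
        mu = sm.predict_values(x_mc).ravel()
        sig = np.sqrt(np.maximum(sm.predict_variances(x_mc).ravel(), 1e-12))
        u = np.abs(mu) / sig                       # U learning function (sign-misclassification risk)
        best = np.argmin(u)
        if u[best] >= 2.0:                         # standard AK-MCS stopping criterion
            break
        x_new = x_mc[best]                         # enrich the DoE at the most ambiguous sample
        x_doe = np.vstack([x_doe, x_new])
        y_doe = np.append(y_doe, limit_state(x_new))

    return np.mean(mu <= 0.0)                      # failure fraction of the MC population
```

The U >= 2 threshold corresponds to a pointwise sign-misclassification probability of roughly Phi(-2), about 2.3%, and n_comp=4 reflects the four-component setting the abstract reports as the best trade-off between CPU time and accuracy.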