Improving Artificial Neural Network Based Streamflow Forecasting Models through Data Preprocessing

2021 
Real-time hydrological data may contain noise, missing information, and deviations from the original scale due to the complex and nonlinear nature of hydrological processes. When used as-is in hydrological forecasting, such data may introduce uncertainty into hydrological models, especially data-driven models that rely entirely on the input-output data. The current research provides a simple preprocessing approach to improve the performance of ANN-based streamflow estimation models by providing a better input state. The two-step preprocessing approach comprises data transformation through a family of power transformations, the Box-Cox transformation, and the selection of appropriate input variables through the Gamma Test. The original data, which is essentially antecedent upland catchment information from thirteen stations located in the Upper Indus Basin (UIB), comprises twenty inputs, including precipitation, solar radiation, and discharge. The Box-Cox transformation has been applied to prepare a transformed data-set, and the power factor, λ (with a best value of 0.005), for this transformation has been determined using probability plots and histogram characteristics. The input combination selection procedure is carried out in the WinGamma environment with the help of a Genetic Algorithm (GA). Two-layer ANN models have been trained with the Broyden-Fletcher-Goldfarb-Shanno (BFGS) training algorithm for both the original and transformed data-sets. The comparison of models clearly indicates that the models developed with the transformed data-set performed better in both training and testing phases, with high values of NSE and R2 (above 90% in most cases) and lower statistical errors, including RMSE, variance, and bias. Simple preprocessing options can significantly reduce the uncertainty in ANN-based hydrological models by improving the quality of real-time hydrological data.
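The two-step workflow summarized above can be illustrated with a minimal sketch under several assumptions: the synthetic input data stands in for the actual UIB station records, the Gamma Test / GA input selection performed in WinGamma is represented only by a placeholder index subset, and scikit-learn's MLPRegressor with its quasi-Newton L-BFGS solver is used in place of the BFGS-trained two-layer ANN reported in the paper.

```python
# Minimal sketch of the described preprocessing + ANN pipeline (illustrative only).
import numpy as np
from scipy.stats import boxcox
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(0)

# Synthetic, strictly positive hydrological inputs (placeholder for the 20 catchment inputs).
X = rng.gamma(shape=2.0, scale=50.0, size=(1000, 20))
y = X[:, :3].sum(axis=1) + rng.normal(0.0, 5.0, size=1000)

# Step 1: Box-Cox power transformation with the paper's reported lambda = 0.005.
lam = 0.005
X_t = np.column_stack([boxcox(X[:, j], lmbda=lam) for j in range(X.shape[1])])

# Step 2 (placeholder): the paper selects inputs via the Gamma Test with a GA in
# WinGamma; here an arbitrary subset merely marks where that selection would apply.
selected = [0, 1, 2, 5, 7]
X_sel = X_t[:, selected]

X_train, X_test, y_train, y_test = train_test_split(
    X_sel, y, test_size=0.3, random_state=0
)

# Two-hidden-layer ANN trained with a quasi-Newton (L-BFGS) optimizer.
ann = MLPRegressor(hidden_layer_sizes=(10, 10), solver="lbfgs",
                   max_iter=2000, random_state=0)
ann.fit(X_train, y_train)

pred = ann.predict(X_test)
rmse = mean_squared_error(y_test, pred) ** 0.5
print(f"R^2 = {r2_score(y_test, pred):.3f}, RMSE = {rmse:.2f}")
```

Note that the Nash-Sutcliffe efficiency (NSE) used in the paper shares the same form as the coefficient of determination computed on predicted versus observed values, so `r2_score` above doubles as an NSE estimate for this sketch.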