Identification of informational and probabilistic independence by adaptive thresholding

2022 
Independence assumptions help Bayesian network classifiers (BNCs), e.g., Naive Bayes (NB), reduce structural complexity and perform surprisingly well in many real-world applications. Semi-naive Bayesian techniques seek to improve classification performance by relaxing the attribute independence assumption. However, dependence rather than independence has received more attention during the past decade, and the validity of independence assumptions needs further exploration. In this paper, a novel learning technique, called Adaptive Independence Thresholding (AIT), is proposed to automatically identify informational independence and probabilistic independence. AIT tunes the network topologies of the BNCs learned from the training data and from each testing instance, respectively, under the framework of target learning. Zero-one loss, bias, variance, and conditional log likelihood are used to compare classification performance in the experimental study. Extensive experimental results on 36 benchmark datasets from the UCI machine learning repository show that AIT is more effective than other learning techniques (such as structure extension and attribute weighting) and helps the final BNCs achieve remarkable classification improvements.
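The abstract does not specify how AIT's thresholds are computed, but identifying informational independence between attributes typically rests on an information-theoretic score such as empirical mutual information. The sketch below is an illustrative heuristic, not the AIT rule from the paper: it estimates pairwise mutual information from discrete data and flags a pair as independent when its score falls below a threshold, defaulting to the mean pairwise score as a simple data-driven ("adaptive") choice. The function names and the mean-based threshold are assumptions introduced for illustration.

```python
import numpy as np
from collections import Counter

def mutual_information(x, y):
    """Empirical mutual information I(X;Y) in nats between two discrete sequences."""
    n = len(x)
    px = Counter(x)
    py = Counter(y)
    pxy = Counter(zip(x, y))
    mi = 0.0
    for (a, b), c in pxy.items():
        # p(x,y) * log( p(x,y) / (p(x) * p(y)) ), with counts rewritten
        # as c*n / (count_x * count_y) to avoid repeated division by n.
        mi += (c / n) * np.log(c * n / (px[a] * py[b]))
    return mi

def independence_by_threshold(data, threshold=None):
    """Flag attribute pairs as informationally independent when their
    empirical mutual information falls below a threshold. If no threshold
    is given, use the mean pairwise MI as a simple adaptive choice
    (an illustrative heuristic, not the rule proposed in the paper)."""
    d = data.shape[1]
    mi = {(i, j): mutual_information(data[:, i], data[:, j])
          for i in range(d) for j in range(i + 1, d)}
    if threshold is None:
        threshold = sum(mi.values()) / len(mi)
    return {pair: val < threshold for pair, val in mi.items()}
```

With this heuristic, a BNC could drop arcs between pairs flagged as independent, reducing structural complexity in the spirit the abstract describes; the paper's actual method additionally adapts the topology per testing instance under target learning, which this sketch does not model.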