Adaptive priority-based cache replacement and prediction-based cache prefetching in edge computing environment

2020 
Abstract With the proliferation of smartphones and tablets, large amounts of data are generated at the edge of the network. Edge computing has become a promising paradigm for processing these data owing to its low latency, high data throughput, and low traffic pressure. However, novel terminal applications and services place increasing demands on timely content delivery, so further reducing latency and improving data service quality remain challenges for data processing. Caching is an effective solution to these issues. To better conserve the cache space of edge nodes so that more high-heat (frequently requested) files can be cached, and thereby improve the quality of user data services, a cache replacement strategy based on priority and LRU is proposed. In this strategy, a replacement candidate is selected according to the LRU principle in each priority queue; the re-access weight of each candidate is then calculated, and the file with the smallest re-access weight is finally replaced. To further improve the quality of data services and reduce user access latency, a cache prefetching strategy based on Bayesian network theory is presented. This strategy selects the files to be prefetched using the Bayesian network and then places them on edge nodes with lower loads. The proposed strategies are evaluated in an edge computing environment built over a campus network. Extensive experimental results show that the proposed cache replacement strategy outperforms the benchmarks in terms of cache hit rate, delay saving rate, and cost saving rate, and that the proposed cache prefetching strategy performs better than the benchmarks in terms of prefetching hit rate and memory load consumption.
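The replacement procedure described above (per-priority LRU queues, eviction of the candidate with the smallest re-access weight) can be sketched as follows. The abstract does not give the re-access weight formula, so the recency-scaled access count used here is only an illustrative assumption, as are the class and method names.

```python
from collections import OrderedDict


class PriorityLRUCache:
    """Sketch of a priority-plus-LRU replacement policy. The
    re-access weight below (access count scaled by recency) is a
    hypothetical stand-in for the paper's formula."""

    def __init__(self, capacity, num_priorities=3):
        self.capacity = capacity
        # One LRU queue (OrderedDict, LRU at the front) per priority level.
        self.queues = [OrderedDict() for _ in range(num_priorities)]
        self.clock = 0  # logical time used to measure recency

    def _size(self):
        return sum(len(q) for q in self.queues)

    def _reaccess_weight(self, entry):
        # Assumed weight: frequently and recently accessed files are
        # judged more likely to be re-accessed, so they score higher.
        count, last_access = entry
        return count / (1 + self.clock - last_access)

    def access(self, file_id, priority):
        """Record an access; return True on a cache hit."""
        self.clock += 1
        q = self.queues[priority]
        if file_id in q:
            count, _ = q.pop(file_id)
            q[file_id] = (count + 1, self.clock)  # move to the MRU end
            return True
        if self._size() >= self.capacity:
            self._evict()
        q[file_id] = (1, self.clock)
        return False

    def _evict(self):
        # Take the LRU candidate from each non-empty priority queue,
        # then evict the one with the smallest re-access weight.
        candidates = []
        for q in self.queues:
            if q:
                fid, entry = next(iter(q.items()))
                candidates.append((self._reaccess_weight(entry), fid, q))
        _, victim, owner = min(candidates, key=lambda c: (c[0], c[1]))
        del owner[victim]
```

With this sketch, a once-accessed file whose last use is old loses to a twice-accessed, recently used file even if the latter sits in a different priority queue, which matches the abstract's idea of comparing LRU candidates across queues by re-access weight.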