Green MEC Networks Design under UAV Attack: A Deep Reinforcement Learning Approach

2021 
In this paper, we propose a novel optimization framework for a secure and green mobile edge computing (MEC) network based on a deep reinforcement learning approach, where the secure data transmission is threatened by an unmanned aerial vehicle (UAV) attacker. To alleviate the local computation burden, some computational tasks can be offloaded to the computational access points (CAPs), at the cost of additional price, transmission latency, and energy consumption. To jointly reduce the price, latency, and energy consumption, we first introduce four optimization criteria: criterion I minimizes a linear combination of price, latency, and energy consumption; criterion II minimizes the price subject to latency and energy-consumption constraints; criterion III minimizes the latency subject to price and energy-consumption constraints; and criterion IV minimizes the energy consumption subject to price and latency constraints. For each criterion, we then design a deep reinforcement learning based optimization framework that dynamically adjusts the task offloading ratio and the bandwidth allocation ratio simultaneously, in which a novel feature extraction network is introduced to improve the training performance. Simulation results are finally presented to verify the effectiveness of the proposed optimization framework.
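To make the four criteria concrete, the sketch below shows how criterion I and criterion II could be cast as reward functions for a DRL agent whose action is the pair (task offloading ratio, bandwidth allocation ratio). This is only a minimal illustration under assumed system parameters: the cost model in mec_cost, the weights, budgets, penalty term, and all numerical values are hypothetical and are not taken from the paper.

```python
import numpy as np

# Hypothetical per-task cost model: the formulas and constants below are
# illustrative assumptions, not the paper's exact system model.
def mec_cost(offload_ratio, bandwidth_ratio, task_bits=1e6, cycles_per_bit=1000,
             f_local=1e9, f_cap=10e9, total_bandwidth=10e6, snr=100.0,
             kappa=1e-27, price_per_bit=1e-7):
    """Return (price, latency, energy) for one user offloading a task fraction."""
    local_bits = (1.0 - offload_ratio) * task_bits
    offload_bits = offload_ratio * task_bits

    # Local computing latency and dynamic CPU energy.
    t_local = local_bits * cycles_per_bit / f_local
    e_local = kappa * (f_local ** 2) * local_bits * cycles_per_bit

    # Offloading: transmission over the allocated bandwidth share, then CAP computing.
    rate = bandwidth_ratio * total_bandwidth * np.log2(1.0 + snr)
    t_tx = offload_bits / max(rate, 1e-9)
    t_cap = offload_bits * cycles_per_bit / f_cap
    e_tx = 0.1 * t_tx                       # assumed 0.1 W transmit power
    price = price_per_bit * offload_bits    # CAP charges per offloaded bit

    latency = max(t_local, t_tx + t_cap)    # local and edge parts run in parallel
    energy = e_local + e_tx
    return price, latency, energy


def reward_criterion_I(action, w_price=1.0, w_latency=1.0, w_energy=1.0):
    """Criterion I: negative weighted sum of price, latency, and energy."""
    offload_ratio, bandwidth_ratio = np.clip(action, 0.0, 1.0)
    price, latency, energy = mec_cost(offload_ratio, bandwidth_ratio)
    return -(w_price * price + w_latency * latency + w_energy * energy)


def reward_criterion_II(action, latency_max=0.5, energy_max=0.01, penalty=10.0):
    """Criterion II: minimize price; latency/energy budgets enforced via a penalty."""
    offload_ratio, bandwidth_ratio = np.clip(action, 0.0, 1.0)
    price, latency, energy = mec_cost(offload_ratio, bandwidth_ratio)
    violation = max(0.0, latency - latency_max) + max(0.0, energy - energy_max)
    return -(price + penalty * violation)
```

Criteria III and IV would follow the same pattern, keeping latency or energy consumption alone in the objective and penalizing violations of the remaining two constraints; how the paper itself handles the constraints inside the DRL training loop is not detailed in this abstract.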