A Game Based Power Allocation in Cloud Computing Data Centers

2018 
The emergence of smart systems based on the Internet of Things (IoT) and other new technologies has led to increased use of cloud computing services. This incentivizes building more geographically distributed data centers. However, data centers consume a tremendous amount of electricity, which significantly increases the load on the power grid. There are broad concerns about the impact this huge consumption may have on the power grid. Moreover, data centers compete selfishly to obtain the maximum amount of power from the smart grid, which also has a negative impact on both the smart grid and the other data centers. In this paper, we model the power allocation problem between the smart grid and cloud data centers as a non-cooperative game. The basic idea of our approach is to determine the optimal quantity of power to assign to each data center in order to achieve a fair power allocation. To do so, we consider each data center's priority in terms of its number of active servers, state of energy charge, and number of running critical applications. Moreover, we prove the existence and uniqueness of the Nash equilibrium, and compute the optimal quantity of power using Lagrange multipliers and the Karush-Kuhn-Tucker (KKT) conditions. Simulation results confirm the effectiveness of the proposed approach and show that our scheme can reduce the load on the power grid by up to 80%.
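The abstract does not give the paper's exact utility functions or priority model, so the following is only a minimal sketch of the general idea: each data center receives a priority weight combining its active servers, stored-energy state, and critical applications (the weighting function here is a hypothetical placeholder), and a capacity-constrained allocation is obtained in closed form from the KKT conditions of an assumed weighted logarithmic-utility objective.

```python
# Illustrative sketch only; not the paper's actual model.
# Assumption: data center i has priority weight w_i and utility
# u_i(p_i) = w_i * log(p_i), subject to a total grid capacity C.
# Stationarity (w_i / p_i = lambda) plus the active capacity constraint
# in the KKT conditions give p_i = C * w_i / sum_j w_j.

from dataclasses import dataclass
from typing import List


@dataclass
class DataCenter:
    name: str
    active_servers: int      # number of active servers
    state_of_charge: float   # local stored-energy level in [0, 1]
    critical_apps: int       # number of running critical applications


def priority(dc: DataCenter,
             alpha: float = 1.0, beta: float = 100.0, gamma: float = 10.0) -> float:
    """Hypothetical priority weight: larger for more servers and more critical
    applications, smaller when local energy storage is already well charged."""
    return (alpha * dc.active_servers
            + gamma * dc.critical_apps
            + beta * (1.0 - dc.state_of_charge))


def allocate_power(dcs: List[DataCenter], capacity: float) -> List[float]:
    """Solve max sum_i w_i*log(p_i) s.t. sum_i p_i <= capacity, p_i >= 0.
    The KKT conditions yield the proportional rule p_i = capacity * w_i / sum_j w_j."""
    weights = [priority(dc) for dc in dcs]
    total = sum(weights)
    return [capacity * w / total for w in weights]


if __name__ == "__main__":
    dcs = [DataCenter("DC-A", 800, 0.2, 5),
           DataCenter("DC-B", 300, 0.8, 1),
           DataCenter("DC-C", 500, 0.5, 3)]
    for dc, p in zip(dcs, allocate_power(dcs, capacity=10_000.0)):
        print(f"{dc.name}: {p:.1f} kW")
```

Under these assumptions the allocation is simply proportional to the priority weights; the paper's game-theoretic formulation would replace the single objective above with per-data-center best responses whose Nash equilibrium is then characterized via the same KKT machinery.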