Design and energy-efficient resource management of virtualized networked Fog architectures for the real-time support of IoT applications

2018 
With the upcoming 5G access networks, it is forecast that Fog computing (FC) and the Internet of Things (IoT) will converge onto the Fog-of-IoT paradigm. Since the FC paradigm, by design, spreads networking and computing resources over the wireless access network, it can support computing-intensive and delay-sensitive streaming applications in the energy-limited wireless IoT realm. Motivated by this consideration, the goal of this paper is threefold. First, it provides a motivating study of the main “killer” application areas envisioned for the considered Fog-of-IoT paradigm. Second, it presents the design of a CoNtainer-based virtualized networked computing architecture. The proposed architecture operates at the Middleware layer and exploits the native capability of the Container Engines to allow dynamic real-time scaling of the available computing-plus-networking virtualized resources. Third, the paper presents a low-complexity penalty-aware bin-packing-type heuristic for the dynamic management of the resulting virtualized computing-plus-networking resources. The proposed heuristic pursues the joint minimization of the networking-plus-computing energy by adaptively scaling up/down the processing speeds of the virtual processors and the transport throughputs of the instantiated TCP/IP virtual connections, while guaranteeing hard (i.e., deterministic) upper bounds on the per-task computing-plus-networking delays. Finally, the actual energy-performance-versus-implementation-complexity trade-off of the proposed resource manager is numerically tested under both static and mobile wireless Fog-of-IoT scenarios, and comparisons against the corresponding performance of some state-of-the-art benchmark resource managers and device-to-device edge computing platforms are also carried out.
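To make the flavor of a penalty-aware bin-packing-type allocation concrete, the following is a minimal, hypothetical Python sketch (not the authors' heuristic): tasks are sorted by the processing speed their hard delay bound demands and placed, first-fit style, on the Fog node whose marginal energy penalty is smallest, with each virtual processor committed only to the minimum speed that still meets the deadline. The cubic energy model, class names, and parameters are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch of a penalty-aware first-fit-decreasing placement.
# All models and names below are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Task:
    cycles: float        # processing demand [Mcycles]
    deadline: float      # hard per-task delay bound [s]

@dataclass
class FogNode:
    max_speed: float                        # peak processing speed [Mcycles/s]
    tasks: list = field(default_factory=list)
    load: float = 0.0                       # speed already committed [Mcycles/s]

    def energy(self, speed: float) -> float:
        # Assumed dynamic-energy model: energy grows cubically with speed.
        return speed ** 3

def penalty_aware_ffd(tasks, nodes):
    """Assign each task to the node with the lowest marginal energy penalty."""
    placements = {}
    # Sort tasks by the speed their deadline demands (cycles / deadline), largest first.
    for i, t in sorted(enumerate(tasks), key=lambda p: -p[1].cycles / p[1].deadline):
        required = t.cycles / t.deadline    # minimum speed meeting the delay bound
        best, best_penalty = None, float("inf")
        for node in nodes:
            if node.load + required > node.max_speed:
                continue                    # this node cannot honor the hard bound
            penalty = node.energy(node.load + required) - node.energy(node.load)
            if penalty < best_penalty:
                best, best_penalty = node, penalty
        if best is None:
            raise RuntimeError(f"task {i}: no node can meet its delay bound")
        best.load += required
        best.tasks.append(i)
        placements[i] = best
    return placements

if __name__ == "__main__":
    tasks = [Task(cycles=80, deadline=0.2), Task(cycles=40, deadline=0.5),
             Task(cycles=120, deadline=0.4)]
    nodes = [FogNode(max_speed=500), FogNode(max_speed=300)]
    for i, node in penalty_aware_ffd(tasks, nodes).items():
        print(f"task {i} -> node with committed speed {node.load:.1f} Mcycles/s")
```

The same greedy, marginal-penalty idea extends to the networking side by treating the transport throughput of each instantiated TCP/IP virtual connection as a second scalable resource with its own energy cost.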