
Edge computing

Edge computing is a distributed computing paradigm that brings computation and data storage closer to the location where they are needed, in order to improve response times and save bandwidth.

The growth of IoT devices at the edge of the network is producing a massive amount of data that must be sent to data centers for processing, pushing network bandwidth requirements to the limit. Despite improvements in network technology, data centers cannot guarantee acceptable transfer rates and response times, which can be a critical requirement for many applications. Furthermore, devices at the edge constantly consume data coming from the cloud, forcing companies to build content delivery networks that decentralize data and service provisioning by leveraging physical proximity to the end user. In a similar way, the aim of edge computing is to move computation away from data centers towards the edge of the network, exploiting smart objects, mobile phones, or network gateways to perform tasks and provide services on behalf of the cloud. By moving services to the edge, it is possible to provide content caching, service delivery, storage, and IoT management, resulting in better response times and transfer rates. At the same time, distributing the logic across different network nodes introduces new issues and challenges.

The distributed nature of this paradigm introduces a shift in the security schemes used in cloud computing. Not only should data be encrypted, but different encryption mechanisms should be adopted, since data may transit between different distributed nodes connected through the internet before eventually reaching the cloud. Edge nodes may also be resource-constrained devices, limiting the choice of security methods. Moreover, a shift from a centralized top-down infrastructure to a decentralized trust model is required. On the other hand, by keeping data at the edge it is possible to shift ownership of collected data from service providers to end users. IoT solutions are currently a prime target for attackers, but edge computing can help secure these networks by improving data security.

Scalability in a distributed network must face several issues. First, it must take into account the heterogeneity of the devices, which have different performance and energy constraints, as well as the highly dynamic conditions and the reliability of the connections compared to the more robust infrastructure of cloud data centers. Moreover, security requirements introduce further latency in the communication between nodes, which may slow down the scaling process.

Management of failovers is crucial in order to keep a service alive. If a single node goes down and becomes unreachable, users should still be able to access the service without interruption. Moreover, edge computing systems must provide means to recover from a failure and to alert the user about the incident. To this end, each device must maintain the network topology of the entire distributed system, so that detection of errors and recovery become easily applicable. Other factors that may influence this aspect are the connection technology in use, which may provide different levels of reliability, and the accuracy of the data produced at the edge, which could be unreliable due to particular environmental conditions.
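The failover idea above can be made concrete with a minimal sketch: each node keeps the full topology map locally, periodically probes its peers, and marks unreachable ones so that requests can be redirected to a live replica. This is an illustrative Python example rather than a reference implementation; the names (EdgeNode, heartbeat_round, pick_replica) are invented here, and a real deployment would typically rely on a proper membership or gossip protocol.

```python
import socket


class EdgeNode:
    """Hypothetical edge node that holds the whole topology and probes peers."""

    def __init__(self, node_id, topology):
        self.node_id = node_id
        # topology: {node_id: (host, port)} -- every node keeps the full map,
        # as described above, so failures can be detected and handled locally.
        self.topology = dict(topology)
        self.alive = {nid: True for nid in topology}

    def probe(self, host, port, timeout=1.0):
        """Return True if a TCP connection to the peer succeeds within timeout."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    def heartbeat_round(self):
        """Check every known peer once and record which ones are reachable."""
        for nid, (host, port) in self.topology.items():
            if nid == self.node_id:
                continue
            was_alive = self.alive[nid]
            self.alive[nid] = self.probe(host, port)
            if was_alive and not self.alive[nid]:
                self.on_failure(nid)

    def on_failure(self, nid):
        # Recovery/alerting hook: report the incident; callers can then pick
        # another replica that is still marked alive.
        print(f"node {nid} unreachable; rerouting to a live replica")

    def pick_replica(self):
        """Return any node currently believed to be alive (besides ourselves)."""
        return next((nid for nid, ok in self.alive.items()
                     if ok and nid != self.node_id), None)
```

In practice the probe interval, timeout, and recovery action would depend on the connection technology in use, echoing the reliability factors noted above.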
Edge application services reduce the volumes of data that must be moved, the consequent traffic, and the distance that data must travel, which lowers latency and reduces transmission costs. Computation offloading for real-time applications, such as facial recognition algorithms, showed considerable improvements in response times, as demonstrated in early research. Further research showed that using resource-rich machines near mobile users, called cloudlets, which offer services typically found in the cloud, provided improvements in execution time when some of the tasks were offloaded to the edge node. On the other hand, offloading every task may result in a slowdown due to transfer times between the device and the nodes, so an optimal configuration can be defined depending on the workload (a sketch of this trade-off appears at the end of this section).

Another vision of the architecture is to revive cloud gaming, where the game simulation runs in the cloud and the rendered video is transferred to lightweight clients such as mobile devices and VR glasses; this type of streaming is also known as pixel streaming. Conventional cloud games may suffer from high latency and insufficient bandwidth, since the amount of data transferred is huge due to the high resolutions required by some services. As real-time games such as first-person shooters (FPS) have strict latency constraints, processing the game simulation at the edge node is necessary for immersive gameplay. Edge nodes used for game streaming are known as gamelets, and are usually one or two hops away from the client. Other notable applications include connected autonomous cars, smart cities, and home automation systems.

Definitions of the edge vary. Alex Reznik, Chair of the ETSI MEC ISG standards committee, has a broad definition: 'anything that’s not a traditional data center could be the ‘edge’ to somebody.' Other definitions are more limited. The State of the Edge report concentrates on servers 'in close proximity to the last mile network.' The gamelet paper defines 'the edge node is mostly one or two hops away from the mobile client to meet the response time constraints for real-time games', and states that the 'Gamelet system is basically a distributed micro-cloud system in which the computationally intensive tasks such as game simulation and rendering are offloaded to a Gamelet node that is few hops (most of the time one or two hops) away from the mobile client'. Philip Laidler believes 'edge compute includes workloads running on customer premises'; some call this the customer, enterprise, or device edge. Another, more inclusive way to define edge computing is to include any type of computer program that delivers low latency closer to the requests. Karim Arabi, in an IEEE DAC 2014 keynote and subsequently in an invited talk at MIT's MTL Seminar in 2015, defined edge computing broadly as all computing outside the cloud happening at the edge of the network, and more specifically in applications where real-time processing of data is required. In his definition, cloud computing operates on “Big Data” while edge computing operates on “Instant Data”, that is, real-time data generated by sensors or users.
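As a minimal sketch of the offloading trade-off mentioned earlier (not taken from the cited cloudlet or gamelet work), a task is worth shipping to an edge node only when the expected transfer time plus remote execution time beats running it locally. The function name, the input sizes, and the assumed 5x remote speedup below are illustrative assumptions.

```python
def should_offload(task_bytes, local_time_s, uplink_bps, remote_speedup):
    """Return True if offloading is expected to be faster than local execution."""
    transfer_time = task_bytes * 8 / uplink_bps   # time to ship the input data
    remote_time = local_time_s / remote_speedup   # assumed faster edge hardware
    return transfer_time + remote_time < local_time_s


# Example: 2 MB of image data, 0.8 s of local compute, a 20 Mbit/s uplink,
# and an edge node assumed to be 5x faster than the handset.
if __name__ == "__main__":
    print(should_offload(task_bytes=2_000_000, local_time_s=0.8,
                         uplink_bps=20_000_000, remote_speedup=5))
```

With these numbers the transfer alone already costs as much as the local execution, so the function returns False, which mirrors the observation that offloading every task can cause a slowdown rather than a speedup.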

[ "Cloud computing", "Internet of Things", "Server", "Architecture", "Enhanced Data Rates for GSM Evolution", "cloud computing internet of things", "edge analytics", "Computation offloading", "edge server" ]