Unravelling the Edge – definitions and more
Definition the key to utilizing edge computing

For the past five years, the primary impact of shifting IT platforms has been to drive greater datacenter concentration and the creation of a core IT portfolio. This has included the consolidation of smaller corporate datacenters (often inherited through acquisitions) into larger facilities that leverage technologies such as SSD storage, converged systems, and software-defined infrastructure to boost agility and operational efficiency. It has also included greater use of mega datacenters owned by major colocation, managed services, and SaaS/IaaS cloud datacenter operators to enable faster creation and scale-up of new digital services.

This concentration at the core enables faster, more capable modernization of critical business systems of record needed to support and provide a layer of "trust" for new business initiatives.

In the meantime, the extension of the core to the edge enables rapid, lower-risk access to critical compute, data, and network resources needed to develop fast-evolving and highly scalable mobile engagement and analytical services.

The question remains, as workloads collapse to the core while demand grows for applications closer to the machines and people who consume them: what exactly is the edge?

A series of industry and vendor bodies have proposed definitions. However, recent IDC research shows that the majority of enterprises remain uncertain as to what the edge might be.

The Edge Computing Consortium (ECC) proposes that edge computing is performed on an open platform at the network edge near things or data sources, integrating network, computing, storage, and application core capabilities and providing edge intelligent services.

The Industrial Internet Consortium proposes that edge computing comprises all computation, storage, communications, and processing associated with collecting, transforming, and acting upon information captured from the edge, or transmitted to the edge. Lastly, EdgeX Foundry is a vendor-neutral open source project hosted by The Linux Foundation building a common open framework for IoT edge computing. EdgeX proposes that the edge is an IoT architecture allowing customers to deploy a mix of plug-and-play microservices on compute nodes at the edge.

Each of these defines edge computing by reference to the edge, which is a circular argument at best. In combination, they describe the attributes of the edge: the need for compute, storage, and applications; the role of intelligence and analytics; and the push towards a microservice-oriented, container-deployed, open source (and open-interface) model. None, however, defines the edge as anything more than proximity to consumers of compute and providers of data.

The proposed answer to ‘Where is the edge?’ acknowledges all of the component-driven definitions above. It goes one step further, though, by suggesting that the edge itself is the domain between an IoT endpoint and its nearest connected compute resource where the network transit time between those two points is less than two milliseconds. Such a proposal gives a practical basis for deciding where to place compute nodes to maximise value to the endpoints they support.

Two milliseconds of network latency offers a good compromise between distance and performance, and it can be easily measured with readily available network monitoring and application performance tools. On a perfect network, light travels about 600 km in 2 ms. Allowing for roughly 70% efficiency over real networks, that practical distance falls to around 420 km.
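As a rough illustration (not part of the original piece), the sketch below turns a one-way latency budget into an approximate service radius and checks a measured round-trip time against the 2 ms rule. The constants, helper names, and the sample RTT figure are assumptions for illustration only.

```python
# Rough sketch (illustrative assumptions, not from the article): translate a
# one-way latency budget into an approximate service radius, and test a node
# against the 2 ms edge rule.

SPEED_OF_LIGHT_KM_PER_MS = 300.0   # ~300,000 km/s in a vacuum
NETWORK_EFFICIENCY = 0.70          # assumed practical efficiency over real links

def max_edge_distance_km(latency_budget_ms: float,
                         efficiency: float = NETWORK_EFFICIENCY) -> float:
    """Approximate one-way distance reachable within the latency budget."""
    return SPEED_OF_LIGHT_KM_PER_MS * latency_budget_ms * efficiency

def within_edge(one_way_latency_ms: float, budget_ms: float = 2.0) -> bool:
    """True if a compute node sits inside the 'edge' domain for an endpoint.

    In practice the one-way figure is usually estimated as half of a
    measured round-trip time (RTT / 2).
    """
    return one_way_latency_ms <= budget_ms

if __name__ == "__main__":
    print(f"2 ms budget ≈ {max_edge_distance_km(2.0):.0f} km")  # ~420 km
    measured_rtt_ms = 3.1   # hypothetical RTT from a monitoring tool
    print("edge-eligible:", within_edge(measured_rtt_ms / 2))
```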

Combining all of these definitions and proposals provides an approach to edge computing for the modern day: apply open source platforms with rich APIs, provide apps and analytics, design and deploy devices and gateways to the 2 ms network latency rule, and carefully choose which workloads stay at the core and which move to the edge.
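To make that final placement choice concrete, here is a minimal, hypothetical sketch of a core-versus-edge decision that uses the 2 ms rule as its test. The workload names, latency figures, and helper structure are invented for illustration and are not part of the article.

```python
# Illustrative sketch only: a naive core-vs-edge placement check using the
# 2 ms rule; workload names and latency figures are hypothetical.

from dataclasses import dataclass

EDGE_LATENCY_BUDGET_MS = 2.0

@dataclass
class Workload:
    name: str
    latency_sensitive: bool      # must respond within the edge latency budget?
    nearest_node_rtt_ms: float   # measured RTT to the closest compute node

def placement(w: Workload) -> str:
    """Suggest 'edge' when the workload needs, and can get, sub-2 ms transit."""
    one_way_ms = w.nearest_node_rtt_ms / 2
    if w.latency_sensitive and one_way_ms <= EDGE_LATENCY_BUDGET_MS:
        return "edge"
    return "core"

workloads = [
    Workload("vision-inspection", True, 1.6),
    Workload("monthly-billing", False, 1.6),
    Workload("remote-telemetry-archive", True, 18.0),
]

for w in workloads:
    print(f"{w.name}: {placement(w)}")
```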
