Edge vs. cloud? How to determine when to use both
Both cloud and edge computing have their advantages and challenges
Talk of the cloud has dominated the tech industry for years – not to mention spurred the success of numerous cloud-based products and behemoth cloud providers. In fact, Gartner projected the public cloud services market to be worth $246.8 billion by the end of last year, with cloud adoption strategies anticipated to influence more than half of IT deals through 2020.
But technology advancement rarely happens in a linear fashion.
While cloud adoption remains a critical focus for many organizations, a new era of connected devices is simultaneously transferring data collection and computing power to the edge of networks.
Both cloud and edge computing have their advantages and challenges. The next hurdle for IT teams? Determining how to get the best of both.
The edge rises
It's tough to talk about edge technologies without mentioning the Internet of Things (IoT).
In another study, Gartner estimates that there are 8.4 billion connected devices currently in use around the world, with spending on those endpoints exceeding $2 trillion. Those devices generate a huge amount of data and attempt to send it all back to a centralized cloud. As that volume of data keeps growing, congestion and latency issues become all but inevitable.
Edge technologies offer a solution by pushing that data processing to the edges of a network, typically via applications or nearby computing devices. Edge computing is especially useful in scenarios in which response time is paramount – for instance, devices that need to react immediately to sensor information, such as a smart stoplight or a factory fire alarm.
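The latency argument above can be sketched in a few lines. This is an illustrative toy, not a real device's firmware: the function name, threshold value, and return labels are all hypothetical, and a real edge deployment would involve actual sensor drivers and a cloud upload queue.

```python
# Minimal sketch of the edge pattern described above: latency-critical
# decisions are made locally on the device, while non-urgent data is
# queued for later upload to a centralized cloud.
# All names and the threshold value here are illustrative assumptions.

SMOKE_THRESHOLD = 0.7  # hypothetical normalized smoke-density reading


def handle_reading(reading: float) -> str:
    """Decide locally, on the edge device, how to act on a sensor reading."""
    if reading >= SMOKE_THRESHOLD:
        # Latency-critical path: trigger the alarm immediately,
        # without a round trip to the cloud.
        return "trigger_alarm"
    # Non-urgent readings can be batched and sent to the cloud later
    # for centralized analysis.
    return "queue_for_cloud_upload"


print(handle_reading(0.9))  # dangerous reading: act locally
print(handle_reading(0.2))  # routine reading: defer to the cloud
```

The key design point is that the branch taken most urgently never depends on network availability – exactly the property a stoplight or fire alarm needs.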
For many enterprises, the cloud still offers many necessary benefits. Centralized application and data management is efficient and often more affordable. In addition, distributed applications are inherently more complex to build, deploy and support without the cloud.
Not one or the other
Many industry experts are pushing back on the notion that cloud and edge computing are in competition with each other.
Instead, forward-looking organizations and even many public cloud service providers are beginning to consider how to selectively employ some of both. Amazon Web Services announced the debut of its Amazon Greengrass software for edge gateways and appliances in late 2016.
Small-scale data centers offer another approach. By deploying these data centers in strategic geographic locations, companies can move data processing closer to the end-user or device. Doing so provides similar benefits as edge computing, while still maintaining the centralized management benefits that enterprises love about the cloud.
The strategy is certainly gaining momentum. Sales of so-called micro-modular data centers (MMDCs) may reach nearly $30 million this year, up from $18 million in 2017, according to 451 Research. The report notes that while the overall spend may seem small, MMDCs are playing a significant role in thousands of expensive projects aimed at localizing computing power.
For enterprises, the data deluge will continue. Going forward, edge technologies will often be part of the solution stack for organizations overwhelmed by their computing needs – but likely not the only answer.