
Edge computing can ease the burden for data centers

by Jason Deign

• Edge computing works like the body's reflexes, handling routine tasks at the edge of the network so data centers can focus on more complex work.

Humans are smart. Given enough time, we can work out how to get to the moon, climb the highest mountain, and dive the deepest ocean trench. But all those brains don’t mean a thing if you put your hand in a flame by mistake. 

That’s where your reflex actions come in to save you from harm. Your nerves sense your flesh is burning and jerk your hand away before your brain can even process what is happening. It’s just as well, because if your brain had to deal with every tiny action like this, you would face a mental logjam.

And that is what could happen to data centers without the help of edge computing. Until now, data centers have grown to keep pace with ever-greater number-crunching needs. But recent trends are pushing things to the limit.

The cloud is set to create a threefold boost in global data center traffic by 2021. Mobile data will top 930 exabytes by 2022. And by that year there will be 28.5 billion networked devices and connections worldwide.

Trying to manage all this with data center-based servers, as is the case right now, will put a strain not only on the data centers but also on the networks that link them to the network edge.

Instead, networks need to become more like the human nervous system, taking care of basic functions at the edge and leaving the brains—or data centers—to carry out more complex work. That’s what edge computing is all about. 

“Enterprise deployments generate a ton of data,” says Joshua Taubenheim, technology analyst at MachNation. “It can save time, bandwidth, and costs if local edge devices act as data filters using protocols to restrict what data is sent to the data center.”
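To picture that filtering idea, here is a minimal sketch in Python, using made-up sensor names and thresholds rather than any real product's API: an edge process keeps only the readings that fall outside a normal range and forwards just those to the data center.

```python
# Hypothetical sketch of edge-side data filtering.
# An edge device inspects sensor readings locally and forwards only
# out-of-range values upstream, cutting the data sent to the data center.

from dataclasses import dataclass


@dataclass
class Reading:
    sensor_id: str
    value: float  # e.g. temperature in degrees Celsius


def is_anomalous(reading: Reading, low: float = 10.0, high: float = 35.0) -> bool:
    """Return True if the reading falls outside the expected range."""
    return not (low <= reading.value <= high)


def filter_at_edge(readings: list[Reading]) -> list[Reading]:
    """Keep only the readings worth forwarding to the data center."""
    return [r for r in readings if is_anomalous(r)]


if __name__ == "__main__":
    batch = [
        Reading("temp-01", 21.4),  # normal, dropped at the edge
        Reading("temp-02", 48.9),  # anomalous, forwarded
        Reading("temp-03", 19.8),  # normal, dropped at the edge
    ]
    to_forward = filter_at_edge(batch)
    print(f"Forwarding {len(to_forward)} of {len(batch)} readings to the data center")
```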

Dealing with data at the edge can help make networks faster and less prone to faults, Taubenheim says. “A low-latency design allows data to be processed and consumed more quickly than a traditional cloud architecture,” he notes. 

Edge computing not only carries the promise of freeing up networks and data centers but could also support new use cases, like self-driving cars, which need to react to potential crashes within fractions of a second. That does not leave time for a data center to get involved. 

But while self-driving cars have yet to hit the road in earnest, other ways to deal with data are already heading out to the edge. Companies that use the Internet of Things (IoT), for instance, often use edge computing to deal with the massive amounts of data that sensors provide. 

And a firm called Bluzelle is even aiming to apply the edge computing concept to databases. Its plan is to spread each database across cloud-based servers around the world so that queries can be handled much more quickly than by using one or two remote data centers. 

Chief Executive Officer Pavel Bains expects the concept to be a hit with firms whose applications need high performance for users spread across multiple regions, such as IoT, games, and blockchains.

“If your data is sitting in a data center in London and you have a game that takes off in Mumbai then you have to spin up another database and catch up,” he explains. “If you have edge-based databases, you have servers right there.” 
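One simple way to illustrate that edge-database idea is to route each query to the replica with the lowest latency for the user. The sketch below is a hypothetical example, not Bluzelle's actual API; the server locations and round-trip times are invented.

```python
# Hypothetical sketch: pick the edge database replica with the lowest
# measured latency for a given user, instead of always querying one
# central data center.

REPLICA_LATENCY_MS = {
    # made-up round-trip times as seen by a user in Mumbai
    "london": 120,
    "singapore": 45,
    "mumbai": 8,
}


def nearest_replica(latencies: dict[str, int]) -> str:
    """Return the name of the replica with the lowest round-trip latency."""
    return min(latencies, key=latencies.get)


if __name__ == "__main__":
    target = nearest_replica(REPLICA_LATENCY_MS)
    print(f"Routing query to the '{target}' replica")  # -> mumbai
```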

Used with the permission of http://thenetwork.cisco.com