Cloud computing is a bigger business than most people realize. Its influence is felt not only in the corporate world, where it has displaced once-ubiquitous local servers with smoother off-site alternatives. Even the layman shuttles bits to and from the ethereal data center in the sky thanks to services like Google Photos and Netflix. But is another revolution underway?
We're talking about edge computing. This new paradigm in IT aims to bring remote data centers closer to the people who actually use them. It is especially well suited to time-critical applications where low latency is a must. Here's what you need to know.
In the beginning there was the server
To understand why edge computing makes sense, it helps to put it in historical context, so let's start from the beginning.
Corporate IT used to be a static business. People worked in vast cubicle farms, toiling under the harsh glare of halogen lights. It made sense for their data and business-critical applications to live nearby. Companies would stick servers in well-ventilated rooms on site, or they would rent space in a local data center.
Then things changed. People started working from home more. Companies grew and opened offices in other cities and countries. The local server quickly ceased to make sense, especially given the huge growth in consumer Internet use. It is difficult for a technology company to scale when it is forced to buy, provision, and deploy new servers every other day.
Cloud computing services, such as Microsoft Azure and Amazon Web Services (AWS), solved these problems. Companies could rent space on a server and expand as they grew.
The problem with the cloud in its current incarnation is that it is centralized. Providers like Amazon, Microsoft, and Google have data centers across the globe, but these are often hundreds, if not thousands, of miles away from their customers.
For example, if you are in Edinburgh, Scotland, your nearest AWS data center is in London, more than 300 miles away. Meanwhile, if you are in Lagos, Nigeria, your nearest AWS location on the continent is in Cape Town, South Africa, almost 3,000 miles (roughly 4,800 km) away.
The longer the distance, the higher the latency. Remember that data moves through a fiber-optic cable as pulses of light, and is therefore limited by the laws of physics.
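You can put rough numbers on this. Light in optical fiber travels at roughly two-thirds the speed of light in a vacuum, or about 200,000 km per second, so a simple back-of-the-envelope calculation gives the best-case round-trip time over a given distance (the straight-line distances below are assumed figures, and real routes add switching and routing overhead on top):

```python
# Best-case propagation delay through optical fiber.
# Light in fiber travels at roughly 2/3 of c, i.e. about 200,000 km/s.
SPEED_IN_FIBER_KM_S = 200_000

def round_trip_ms(distance_km: float) -> float:
    """Minimum round-trip time, in milliseconds, over a fiber path."""
    return 2 * distance_km / SPEED_IN_FIBER_KM_S * 1000

# Edinburgh <-> London, ~530 km straight line (assumed figure):
print(round_trip_ms(530))    # about 5.3 ms
# Lagos <-> Cape Town, ~4,800 km straight line (assumed figure):
print(round_trip_ms(4_800))  # about 48 ms
```

Even in this idealized model, the longer hop costs nearly ten times as much latency, before any real-world network overhead is counted.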
So, what is the solution? If history is any guide, the answer lies in repeating it: bringing the servers closer to the people who use them.
Life on the edge
In a nutshell, edge computing means bringing applications and data storage closer to the people who use them. For large companies, this may mean a custom-built server facility near their headquarters. On the consumer front, it may mean IoT devices that perform certain tasks, such as facial recognition, with their own local computing resources rather than farming them out to a cloud service.
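The trade-off can be illustrated with a toy sketch. All the function names here are hypothetical stand-ins: the on-device path represents a small model baked into a smart camera, while the cloud path simulates the network round trip to a remote inference API with a short sleep.

```python
import time

def recognize_on_device(frame: bytes) -> bool:
    # Hypothetical stand-in for an on-device model
    # (e.g. a small, quantized neural network on a smart camera).
    return b"face" in frame

def recognize_in_cloud(frame: bytes) -> bool:
    # Hypothetical stand-in for a remote inference API; the sleep
    # simulates a ~50 ms network round trip to a distant data center.
    time.sleep(0.05)
    return b"face" in frame

frame = b"...face..."

start = time.perf_counter()
local_result = recognize_on_device(frame)
local_ms = (time.perf_counter() - start) * 1000

start = time.perf_counter()
cloud_result = recognize_in_cloud(frame)
cloud_ms = (time.perf_counter() - start) * 1000

print(local_result == cloud_result)  # same answer either way
print(local_ms < cloud_ms)           # but no round trip locally
```

The answer is identical either way; what changes is where the waiting happens, which is precisely the point of pushing the work to the edge.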
There are some clear benefits to this. First, it reduces the amount of traffic that needs to cross the network. Considering that many large companies pay steep fees to shuffle bits between data centers, it makes sense to bring them closer to home.
Second, it reduces latency. Often, a large part of the time required to perform a task is spent moving traffic over the network. Bringing computing power closer to home cuts that latency and speeds things up.
This could open the door to new classes of applications for which immediacy is key. An oft-cited example is the "smart city," where local government can collect information on things like utility usage and road-traffic patterns in real time and then act on it quickly.
There are also potential uses for edge computing in the industrial sector. These include allowing manufacturers to collect data about their equipment and make rapid adjustments, thereby reducing energy use and wear on the machinery.
On the consumer side, edge computing has the potential to make things like cloud gaming a more satisfying experience. If the graphics number-crunching happens closer to the players, they are less likely to experience unpleasant lag, which can be the deciding factor in who wins an online match.
The 5G factor
Running in parallel with the steady rise of edge computing is the rollout of 5G connectivity. Although still in its infancy, 5G promises significantly lower latencies than previous mobile standards. As a result, you can expect it to play a major role in the development of edge computing as a paradigm.
What does this mean? In the logistics sector, expect a greater emphasis on data and analytics, as trucks and vans transmit information that must be analyzed and acted upon in real time. There are also prospects for "smart agriculture," which would automate large parts of agricultural production. This would not only improve yields but also reduce waste.
Then there is the consumer side. By bringing the "heavy lifting" closer to people's phones, you unlock newer, more immersive entertainment experiences in areas like virtual reality (VR), augmented reality (AR), and gaming.
Of course, all of this is still a way off. Carriers and developers must build it first. But when they do, you can expect a seismic change like the one that occurred when cloud computing first exploded onto the scene.
RELATED: What is 5G and how fast will it be?