
The Quick Guide to Understanding Edge Computing

Some Dude Says
Dec 20, 2020 · 7 min read


Image by Free-Photos from Pixabay

Edge computing is one of the biggest paradigm shifts for cloud computing in recent years. The concept boils down to reducing the “distance” data has to travel by moving computation closer to the “edge” of the network, nearer the devices that produce and consume that data. The term can be confusing because the “edge” doesn’t have a solid definition. The overall goal is to cut down long-distance communication between devices so that latency drops and the whole process becomes more efficient. You pull out the easy pieces of a workload and run them on the hardware available at the edge. Done right, edge computing saves time and money for both the cloud service and the people using it.
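To make that concrete, here is a minimal sketch in Python of an edge node handling the “easy piece” locally: summarizing raw sensor readings so only a small payload ever crosses the network. The readings, the threshold, and the send_to_cloud placeholder are all made up for illustration, not any particular platform’s API.

```python
import statistics

# Hypothetical sketch: an "edge" gateway summarizes raw sensor readings
# locally and only ships the small summary upstream. The readings, the
# alert threshold, and the upstream call are illustrative placeholders.

def summarize_on_edge(readings, alert_threshold=75.0):
    """Reduce a batch of raw readings to a compact summary on edge hardware."""
    return {
        "count": len(readings),
        "mean": statistics.mean(readings),
        "max": max(readings),
        "alerts": [r for r in readings if r > alert_threshold],
    }

def send_to_cloud(payload):
    # Placeholder for a real upstream call (e.g. HTTPS or MQTT);
    # printing stands in for the long-distance hop we want to minimize.
    print("forwarding to cloud:", payload)

if __name__ == "__main__":
    raw = [61.2, 63.0, 78.4, 60.9, 62.1, 79.9, 61.7]  # e.g. one batch of sensor data
    # Instead of forwarding all seven readings, the edge node sends one summary.
    send_to_cloud(summarize_on_edge(raw))
```

The point isn’t the specific summary; it’s that the long-distance hop happens once with a tiny payload instead of once per reading.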

Distributed computing is not a new paradigm, but the application of edge computing (arguably) is. The general principles work out the same, but the philosophy behind them is different. Multi-host redundancy, hybrid environments, on-premises services, and so on are all examples of distributed computing, but intent is what determines which of them count as edge computing and which don’t.

While all edge computing is distributed computing, not all distributed computing is edge computing. The term stays fuzzy until you understand the intent behind it. Let’s break down what actually defines edge computing.

Defining Edge Computing
