
Edge Computing Definition


What is edge computing? 

Edge computing refers to a distributed IT architecture in which client data is processed at the edge of the network, as close to the originating source as possible.

Data is vital to modern businesses. It provides valuable business insights and supports real-time control of critical business processes. Businesses today are drowning in data. Huge amounts of data can be collected routinely from sensors and IoT devices operating in remote locations and inhospitable environments around the globe.

This flood of data is changing how businesses use computing. Traditional computing models that rely on the internet and a central data center are not well suited to moving ever-increasing amounts of real-world data. These efforts can be hindered by bandwidth limitations, latency issues, and unpredictable network disruptions. Businesses are addressing these data challenges with edge computing architecture.

Edge computing is simply a way to move some storage and compute resources closer to the source of data. Instead of sending raw data to a central data center for processing and analysis, that work is done where the data is generated. This could be a retail store, a factory floor, or even a smart community spread across a city. Only the results of computing at the edge, such as real-time business insights, equipment maintenance predictions, or other actionable answers, are sent back to the main data center for review and human interaction.
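The pattern described above, processing raw data locally and forwarding only a compact result, can be sketched in a few lines of Python. This is a minimal illustration, not a real edge framework; the names (summarize, send_upstream) and the sample readings are all hypothetical.

```python
# Sketch of edge-side processing: a batch of raw sensor readings is
# aggregated locally, and only the compact summary is forwarded upstream.
# All names and values here are illustrative, not a real edge API.

def summarize(readings):
    """Reduce a batch of raw readings to a small result dictionary."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": sum(readings) / len(readings),
    }

def send_upstream(result):
    # Stand-in for a real transport (e.g. MQTT or HTTPS to the data center).
    print(f"sending {result} to the central data center")

readings = [21.4, 21.9, 22.1, 35.0, 21.7]  # e.g. temperature samples
send_upstream(summarize(readings))  # a few bytes travel instead of the raw stream
```

The point is the asymmetry: thousands of raw readings stay at the edge, while only the summary crosses the WAN.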

Edge computing is changing IT and business computing. Take a detailed look at the definition of edge computing, its impact, and tradeoffs.

 Edge computing brings data processing closer to the data source.

What is edge computing?

It is all about location. In traditional enterprise computing, data is produced at a client endpoint, such as a user's laptop. The data is then moved across the corporate LAN and over a WAN, such as the internet, to an enterprise application where it is stored and processed. The results are then sent back to the client. This is a tried-and-true model of client-server computing that works well for most business applications.

The number of connected devices, as well as the volume of data they produce, is growing far faster than traditional data center infrastructures can handle. Gartner forecasts that by 2025, 75% of enterprise-generated data will be created outside of centralized data centers. Moving that much data places a huge burden on the global internet, and the movement itself can cause disruptions and delays.

IT architects have shifted their focus away from the central data center to the logical edges of the infrastructure. This means taking computing and storage resources out of the data center and moving them to the place where the data is generated. The principle is simple: if you cannot move the data closer to the data center, move the data center closer to the data. Edge computing is not a new concept. It is rooted in decades-old ideas of remote computing, such as branch offices and remote offices, where it was more reliable to place computing resources at the desired location than to rely on a single central site.

 Although only 27% of respondents have already implemented edge computing technologies, 54% find the idea interesting.

Edge computing places storage and servers where the data is. Often, it requires little more than a rack of gear running on a remote LAN to collect and process data locally. In many cases, the computing gear is placed in hardened or shielded enclosures to protect it from extremes of temperature, humidity, and other environmental conditions. Processing often involves normalizing the data stream and analyzing it for business intelligence. Only the results are sent back to the principal data center.
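The "normalize, then analyze" step mentioned above can be illustrated with a short Python sketch. Under the assumption of a batch of mixed-scale sensor readings, the values are rescaled to a common 0-to-1 range and obviously abnormal readings are flagged; in practice only the flags would travel back to the data center. The function names and the threshold are illustrative choices, not part of any specific edge product.

```python
# Sketch of normalizing a data stream and flagging anomalies at the edge.
# normalize() rescales a batch to [0, 1]; flag_outliers() returns the
# indices of values at or above a chosen threshold. Names are illustrative.

def normalize(values):
    """Min-max scale a batch of readings to the range [0, 1]."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0 for _ in values]  # constant batch: nothing stands out
    return [(v - lo) / (hi - lo) for v in values]

def flag_outliers(scaled, threshold=0.9):
    """Return the indices of scaled readings at or above the threshold."""
    return [i for i, v in enumerate(scaled) if v >= threshold]

raw = [480, 495, 510, 2100, 500]  # one reading is clearly abnormal
flags = flag_outliers(normalize(raw))
print(flags)  # → [3]
```

Min-max scaling is just one simple choice; real deployments might use z-scores or model-based anomaly detection, but the shape of the pipeline, normalize locally and ship back only the findings, is the same.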

Business intelligence can mean many different things. In retail environments, for example, video surveillance can be combined with sales data to determine the most effective product configurations or to gauge consumer demand. Predictive analytics can guide equipment maintenance and repair before actual failures or defects occur. Other examples involve utilities, such as water treatment and electricity generation, where edge processing helps keep equipment functioning properly and maintains the quality of output.

Edge vs. Cloud vs. Fog Computing

Edge computing is closely related to the concepts of cloud computing and fog computing. These concepts may overlap, but they are not the same thing and should not be used interchangeably. It is helpful to understand the differences between them and to compare how they work.

It is easy to see the differences in edge, cloud, and fog computing by highlighting their common theme. All three concepts are related to distributed computing. They focus on the physical deployment and storage of computing and storage resources relative to the data being produced. It is all about where these resources are located.


