Edge computing: An emerging trend in IoT
Fog computing (also known as edge computing) is set to become a major driver of the Internet of Things (IoT) in the coming years. With an ever-increasing number of connected devices generating unprecedented volumes of data, connecting everything to a central cloud is becoming impractical.
Today’s cloud models are not designed for the volume, variety, and velocity of data that the IoT generates. Billions of previously unconnected devices are already generating more than two exabytes of data each day, and an estimated 50 billion “things” will be connected to the Internet by 2020. Moving all of that data to the cloud for analysis would require vast amounts of bandwidth, and congestion from the growing number of connected devices drives up network latency. Often, by the time the data makes its way to the cloud for processing, the opportunity to act on it is gone.
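A back-of-the-envelope calculation using the figures cited above shows the scale of the problem. The numbers below are the article's estimates, not measurements:

```python
# Rough estimate based on the figures cited above (illustrative, not measured).
DEVICES = 50e9              # estimated connected "things"
DATA_PER_DAY_BYTES = 2e18   # two exabytes generated per day
SECONDS_PER_DAY = 86_400

# Sustained aggregate throughput needed to ship everything to a central cloud.
aggregate_bps = DATA_PER_DAY_BYTES * 8 / SECONDS_PER_DAY

# Average volume generated per device per day.
per_device_bytes = DATA_PER_DAY_BYTES / DEVICES

print(f"Aggregate throughput: {aggregate_bps / 1e12:.1f} Tbit/s")
print(f"Per-device volume:    {per_device_bytes / 1e6:.0f} MB/day")
```

Even this crude estimate implies a sustained aggregate load in the hundreds of terabits per second, which makes the case for filtering and analyzing data near where it is produced.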
Capitalizing on the IoT requires a new kind of infrastructure. Reducing latency, the delay between when data is generated and when it can be acted on, is critical to harnessing the full potential of the IoT. Until now, latency-reduction efforts have focused primarily on upgrading infrastructure to increase network capacity, but the time and cost of such projects can be formidable. Fog computing addresses the problem differently, distributing the load across many small-scale data centers at the network edge instead of a single large facility.
The case for edge computing | Fog server cooling
Edge or fog computing pushes applications, data, and computing power away from centralized computing nodes to the edge of the network. In addition to providing sub-second response to end users, edge computing provides high levels of scalability, reliability, and fault tolerance. The advantages of edge computing include:
- Significantly less data movement across the network, which reduces congestion, cost, and latency and eliminates the bottlenecks of centralized computing systems
- Improved security of encrypted data, which stays closer to the end user, reducing its exposure to hostile elements
- Improved scalability through virtualized systems
Edge computing brings bandwidth-intensive content and latency-sensitive applications closer to the user or data source. The most time-sensitive data is analyzed at the network edge, close to where it is generated, rather than being sent to the cloud. This allows systems to act on IoT data in milliseconds, while sending selected data to the cloud for historical analysis and longer-term storage.
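The split described above, acting on time-sensitive readings at the edge while forwarding the rest to the cloud for historical analysis, can be sketched as a simple edge-side filter. This is a minimal illustration; the threshold, reading format, and routing logic are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Reading:
    sensor_id: str
    value: float
    timestamp: float

# Hypothetical threshold: readings above it demand an immediate local response.
CRITICAL_THRESHOLD = 90.0

def act_locally(reading: Reading) -> str:
    """React within milliseconds at the edge (e.g. trip an alarm or actuator)."""
    return f"ALERT {reading.sensor_id}: {reading.value}"

def route(readings: list[Reading]) -> tuple[list[str], list[Reading]]:
    """Handle time-sensitive readings at the edge; batch the rest for the cloud."""
    alerts, to_cloud = [], []
    for r in readings:
        if r.value > CRITICAL_THRESHOLD:
            alerts.append(act_locally(r))   # sub-second local response path
        else:
            to_cloud.append(r)              # queued for historical analysis
    return alerts, to_cloud

alerts, batch = route([Reading("m1", 95.2, 0.0), Reading("m2", 41.0, 1.0)])
```

The point of the sketch is the decision itself: only the routine readings ever cross the network, so the bandwidth-heavy path carries a fraction of the raw data.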
Edge computing speeds up awareness and response to events. In industries such as manufacturing, oil and gas, utilities, transportation, mining, and the public sector, faster response times can improve output, boost service levels, and increase safety. Our edge/fog server cooling solutions are built with this in mind.
On a factory floor, a temperature sensor on a critical machine sends readings associated with imminent failure. A technician is dispatched to repair the machine in time to avoid a costly shutdown.
In oil and gas exploration, collected data can be processed in real time onsite instead of requiring transportation to a data center.
In utilities, ruggedized cameras at remote field substations detect an intruder and alert security officers. Almost instantaneous analysis reveals similar events at other substations, automatically raising the alert to the highest level.
In video production, compute-intensive processing can be done on location to reduce overall production time.
Problems with edge computing, and how a liquid cooling system solves them
Current fog computing systems are limited by the processing power of the devices at the very edge of the network, so more intensive processing operations still have to be performed in the cloud. The servers performing those operations generate substantial heat and require a controlled data center environment to keep them cool.
The LCS Edge
LCS technology supports high-performance computing in the field without mechanical cooling, surmounting many of the key obstacles to implementing edge infrastructure. All the processing-intensive applications currently performed in the cloud can be pushed to the edge using LCS’s patented cooling system.
Mobile cloud computing (cloudlet)
A cloudlet is a mobility-enhanced small-scale cloud data center that is located at the edge of the Internet. The main purpose of the cloudlet is to support resource-intensive and interactive mobile applications by providing powerful computing resources to mobile devices with lower latency.
A cloudlet is a data center in a box that brings the cloud closer. It is a new architectural element that extends cloud computing infrastructure. It represents the middle tier of a 3-tier hierarchy: mobile device -> cloudlet -> cloud.
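One way to picture the 3-tier hierarchy is as a fallback chain: a task is served by the nearest tier that has capacity for it. The sketch below is illustrative; the latency figures and capacity model are assumptions, not part of any cloudlet specification:

```python
# Illustrative latency budget per tier, in milliseconds (assumed values).
TIERS = [
    ("mobile device", 1),    # on-device compute
    ("cloudlet", 20),        # small-scale data center at the network edge
    ("cloud", 150),          # central data center
]

def place(task_demand: int, tier_capacity: dict[str, int]) -> tuple[str, int]:
    """Offload a task to the nearest tier with enough capacity for it."""
    for name, latency_ms in TIERS:
        if tier_capacity.get(name, 0) >= task_demand:
            return name, latency_ms
    raise RuntimeError("no tier can serve this task")

# A task too heavy for the device lands on the cloudlet, not the distant cloud.
capacity = {"mobile device": 2, "cloudlet": 50, "cloud": 10_000}
tier, latency = place(task_demand=30, tier_capacity=capacity)
```

Under this model, the cloudlet absorbs the resource-intensive work the mobile device cannot do, and only tasks beyond its capacity fall through to the high-latency cloud tier.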
LCS brings the cloudlet to life, surmounting onerous and expensive facility requirements to deploy a data center in any environment.
LCS devices are built with portability in mind, requiring no air conditioning and occupying less than a third of the space of an equivalent air-cooled setup. Because fans are eliminated, operation is silent and temperatures fluctuate less, so IT hardware can be installed in a variety of environments previously considered off-limits, including:
- Outdoors, in any climate
- Indoors, in a closet
- Dusty environments
- In the engine compartment of a vehicle
LCS devices are extraordinarily reliable, requiring minimal maintenance and dramatically increasing equipment lifespan.
Example use cases
- Grid computing
- Cluster computing
- Utility computing
- Parallel processing
- Sensor networks
- Distributed ledger technology (DLT)