Edge computing allows data produced by internet of things (IoT) devices to be processed closer to where it is created instead of sending it across long routes to data centers or clouds. Doing this computing closer to the edge of the network lets organizations analyze important data in near real time, a need for organizations across many industries, including manufacturing, health care, telecommunications and finance. "In most scenarios, the presumption that everything will be in the cloud with a strong and stable fat pipe between the cloud and the edge device – that's just not realistic," says Helder Antunes, senior director of corporate strategic innovation at Cisco. Edge computing is a "mesh network of micro data centers that process or store critical data locally and push all received data to a central data center or cloud storage repository, in a footprint of less than 100 square feet," according to research firm IDC. The term typically comes up in IoT use cases, where edge devices collect data, sometimes massive amounts of it, and send it all to a data center or cloud for processing.
The following is a guest article from Dr. Rob MacInnis, CEO and founder of AetherWorks. When it comes to processing and storing data, should we expect the cloud to continue its reign as the go-to option in 2017? The data suggest that this might be the case. For example, spending on public cloud storage is predicted to reach 17% of total enterprise storage by 2017, up from 8% today. IT spending on cloud infrastructure, according to IDC, will exceed $37 billion in 2016, an increase of more than 15% from the previous year, and by 2020 will nearly equal the amount spent on traditional, on-premises IT.
Edge Computing (EC) allows data generated by the Internet of Things (IoT) to be processed near its source, rather than sending the data great distances to data centers or the Cloud. More specifically, Edge Computing uses a network of micro data centers that process or store the data locally, each within a footprint of less than 100 square feet. Prior to Edge Computing, it was assumed that all data would be sent to the Cloud over a large and stable pipeline between the edge/IoT device and the Cloud.
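The pattern described above, aggregating raw readings locally and pushing only a compact result to a central data center, can be sketched in a few lines. This is a minimal illustration, not any real edge platform's API; the function name, the `threshold` parameter, and the summary fields are all assumptions chosen for the example.

```python
from statistics import mean

def summarize_readings(readings, threshold=75.0):
    """Aggregate raw sensor readings at the edge and return only the
    compact summary that would be pushed upstream to the central
    data center or cloud repository.

    `readings` is a list of numeric sensor values; `threshold` marks
    readings considered critical. Both are illustrative assumptions.
    """
    critical = [r for r in readings if r > threshold]
    return {
        "count": len(readings),            # how many raw readings were seen
        "mean": round(mean(readings), 2),  # one number instead of many
        "max": max(readings),
        "critical_count": len(critical),   # readings above the threshold
    }

# A burst of raw temperature readings collected by an IoT device.
raw = [70.1, 71.3, 80.2, 69.8, 77.5]
summary = summarize_readings(raw)
```

Instead of shipping every reading across the network, the edge node sends a handful of fields, which is the bandwidth saving that motivates the micro-data-center architecture.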
Internet of Things (IoT) analytics refers to the collection and analysis of data stemming from a large number of heterogeneous Internet-connected objects. IoT analytics is an integral element of the vast majority of IoT applications, which process data in order to offer data-intensive services or to drive actuation and control decisions. The velocity of IoT data streams is usually the attribute that differentiates IoT analytics systems from the majority of conventional Big Data systems, which handle large volumes of transactional data. Therefore, IoT analytics systems are usually supported by middleware frameworks for streaming data (such as the open source Apache Storm, Spark and Flink frameworks) rather than the popular MapReduce Big Data processing framework. Given their Big Data nature, IoT analytics systems are usually integrated with Cloud computing infrastructures in order to take advantage of the scalability, storage capacity and processing performance of the Cloud.
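The velocity-driven, event-at-a-time processing style that distinguishes streaming frameworks from batch MapReduce can be illustrated with a simple sliding-window average. This is a hedged sketch in plain Python, assuming a simulated sensor stream; it mimics the idea behind frameworks like Storm or Flink without using their APIs.

```python
from collections import deque

def windowed_averages(stream, window_size=3):
    """Consume a stream of IoT readings one event at a time, emitting
    the average over the most recent `window_size` events.

    Each event produces an immediate result, in contrast to a batch
    MapReduce job that would wait for the full data set. The stream
    and window size here are illustrative assumptions.
    """
    window = deque(maxlen=window_size)  # old events fall out automatically
    for value in stream:
        window.append(value)
        yield round(sum(window) / len(window), 2)

# Simulated high-velocity sensor stream.
readings = [10, 20, 30, 40, 50]
averages = list(windowed_averages(readings))
```

Because the generator yields a result per incoming event, it captures the near-real-time character of IoT analytics described above, even though production systems would distribute this work across a streaming cluster.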
Cloud computing, often simply referred to as "the cloud," represents a group of servers and computers connected to each other over the internet to create a large distributed infrastructure that can deliver on-demand services over the internet on a pay-per-use basis. Most cloud computing services can be split into three big categories: Infrastructure-as-a-Service (IaaS), Platform-as-a-Service (PaaS) and Software-as-a-Service (SaaS). The focus of this article will mainly be on IaaS, a space dominated by players such as Amazon, Microsoft and Google. Today there is a broad spectrum of advantages that drives companies toward a cloud infrastructure. For many traditional scenarios, where you do not have an extreme volume of data, a classical cloud computing architecture covers the demand.