Machine Learning at the Network Edge: A Survey

arXiv.org Machine Learning

Devices that make up the Internet of Things, such as sensors and small cameras, usually have small memories and limited computational power. The proliferation of such resource-constrained devices in recent years has led to the generation of large quantities of data. These data-producing devices are appealing targets for machine learning applications but struggle to run machine learning algorithms due to their limited computing capability. They typically offload input data to external computing systems (such as cloud servers) for further processing. The results of the machine learning computations are communicated back to the resource-scarce devices, but this worsens latency, leads to increased communication costs, and adds to privacy concerns. Therefore, efforts have been made to place additional computing devices at the edge of the network, i.e., close to the IoT devices where the data is generated. Deploying machine learning systems on such edge devices alleviates the above issues by allowing computations to be performed close to the data sources. This survey describes major research efforts where machine learning has been deployed at the edge of computer networks.
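
To make the offload-versus-edge trade-off the survey describes concrete, the sketch below contrasts the two patterns in Python. It is illustrative only: the endpoint URL and the local_model object are hypothetical placeholders, not anything taken from the survey.

```python
# Minimal sketch (not from the survey) contrasting the two patterns described
# above: offloading inference to a remote server vs. running a small model
# locally on the edge device. The endpoint URL and local_model are
# hypothetical placeholders.
import time
import requests  # assumes the standard 'requests' package is installed


def classify_via_cloud(sample: bytes) -> dict:
    """Offload pattern: ship raw data to a remote server and wait for the result."""
    start = time.monotonic()
    resp = requests.post("https://example.com/api/classify", data=sample, timeout=5.0)
    resp.raise_for_status()
    result = resp.json()
    result["round_trip_s"] = time.monotonic() - start  # network latency dominates
    return result


def classify_on_device(sample, local_model) -> dict:
    """Edge pattern: run a small, pre-optimized model next to the data source."""
    start = time.monotonic()
    label = local_model.predict(sample)  # no raw data leaves the device
    return {"label": label, "round_trip_s": time.monotonic() - start}
```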


Engineering Tiny Machine Learning for the Edge - InformationWeek

#artificialintelligence

Edge is all about intelligence, but those smarts must be squeezed into ever tinier form factors. Developers of artificial intelligence (AI) applications must make sure that each new machine learning (ML) model they build is optimized for fast inferencing on one or more target platforms. Increasingly, these target environments are edge devices such as smartphones, smart cameras, drones, and embedded appliances, many of which have severely constrained processing, memory, storage, and other local hardware resources. The hardware constraints of smaller devices are problematic for the deep neural networks at the heart of more sophisticated AI apps. Many neural-net models can be quite large and complex. As a result, the processing, memory, and storage requirements for executing those models locally on edge devices may prove excessive for some mass-market applications that require low-cost commoditized chipsets.
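
As a concrete illustration of the kind of model optimization the article alludes to, the sketch below applies post-training quantization with TensorFlow Lite, one widely used way to shrink a trained network for constrained edge hardware. The SavedModel path is a placeholder, and the actual size and accuracy trade-offs depend on the model and the target chipset.

```python
# Minimal sketch (not from the article): post-training quantization with
# TensorFlow Lite to reduce model size and speed up on-device inference.
import tensorflow as tf

# Load a trained model from a placeholder SavedModel directory.
converter = tf.lite.TFLiteConverter.from_saved_model("path/to/saved_model")
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # quantize weights to 8-bit
tflite_model = converter.convert()

with open("model_quantized.tflite", "wb") as f:
    f.write(tflite_model)  # typically a fraction of the float32 model's size
```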


How Edge AI is a Roadmap to Future AI and IoT Trends?

#artificialintelligence

Change has always been integral to development. With fast-evolving technologies, companies, too, need to embrace them to maximize the benefits. Artificial Intelligence (AI) is moving to edge IoT devices and networks, just as we saw computing shift from mainframes to the cloud. And as data volumes continue to grow, data storage and computation increasingly need to be located on the device itself. Companies like Qualcomm, NVIDIA, and Intel are helping make this a reality.


Embedded Development Boards for Edge-AI: A Comprehensive Report

arXiv.org Artificial Intelligence

The use of Deep Learning and Machine Learning is becoming more pervasive by the day, opening doors to new opportunities in every area of technology. Applications range from healthcare to self-driving cars, and from home automation to smart agriculture and Industry 4.0. Traditionally, the majority of the processing for IoT applications has been done in a central cloud, but that has its own issues, including latency, security, bandwidth, and privacy. It is estimated that there will be around 20 million IoT devices by 2020, which will compound the problems of sending data to the cloud and processing it there. A new trend of processing data at the edge of the network is emerging. The idea is to do the processing as near to the point of data production as possible. Processing on the nodes that generate the data is called Edge Computing, and processing on a layer between the cloud and the point of data production is called Fog Computing. There are no standard definitions for either, so the terms are often used interchangeably. In this paper, we review the development boards available for running Artificial Intelligence algorithms on the edge.
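
As a rough illustration of what running AI algorithms "on the edge" looks like on such development boards, the sketch below loads a pre-converted TensorFlow Lite model with the lightweight tflite_runtime package, which is commonly used on boards like the Raspberry Pi. The model file and the dummy input are placeholders, not anything from the report.

```python
# Minimal sketch (not from the report): on-board inference with tflite_runtime.
import numpy as np
from tflite_runtime.interpreter import Interpreter

# Load a pre-converted, quantized model from a placeholder path.
interpreter = Interpreter(model_path="model_quantized.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Dummy input matching the model's expected shape and dtype.
dummy = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
interpreter.set_tensor(input_details[0]["index"], dummy)
interpreter.invoke()

prediction = interpreter.get_tensor(output_details[0]["index"])
print(prediction)
```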


Can Edge Analytics Become a Game Changer? - KDnuggets

#artificialintelligence

By Sciforce, software solutions based on science-driven information technologies. One of the major IoT trends for 2019 that is constantly mentioned in rankings and articles is edge analytics. It is considered the future of sensor handling, and in some cases it is already preferred over the usual cloud approach. First of all, let's go deeper into the idea. Edge analytics refers to an approach to data collection and analysis in which an automated analytical computation is performed on data at a sensor, network switch, or another device, instead of sending the data back to a centralized data store.
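
A minimal sketch of the edge-analytics pattern described above, assuming a simple z-score rule: the device keeps a rolling window of readings, computes statistics locally, and transmits only an alert when a reading looks anomalous, instead of streaming every raw value to a central store. The window size, threshold, and send_alert stub are illustrative assumptions.

```python
# Minimal sketch (not from the article): local anomaly detection at the sensor.
from collections import deque
from statistics import mean, stdev

WINDOW = 60          # number of recent readings kept on-device
Z_THRESHOLD = 3.0    # how many standard deviations counts as anomalous
readings = deque(maxlen=WINDOW)


def send_alert(value: float, z: float) -> None:
    # Placeholder: in practice this would publish a small MQTT/HTTP message.
    print(f"anomaly: value={value:.2f} z={z:.2f}")


def ingest(value: float) -> None:
    """Analyze a new sensor reading locally instead of forwarding it."""
    if len(readings) >= 2:
        mu, sigma = mean(readings), stdev(readings)
        if sigma > 0 and abs(value - mu) / sigma > Z_THRESHOLD:
            send_alert(value, abs(value - mu) / sigma)
    readings.append(value)
```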