Results


Three opportunities of Digital Transformation: AI, IoT and Blockchain

#artificialintelligence

Koomey's law: This law posits that the energy efficiency of computation doubles roughly every one-and-a-half years (see Figure 1–7). In other words, the energy necessary for the same amount of computation halves in that time span. To visualize the exponential impact this has, consider the fact that a fully charged MacBook Air, when applying the energy efficiency of computation of 1992, would completely drain its battery in a mere 1.5 seconds. According to Koomey's law, the energy requirements for computation in embedded devices are shrinking to the point that harvesting the required energy from ambient sources like solar power and thermal energy should suffice to power the computation necessary in many applications.

Metcalfe's law: This law has nothing to do with chips, but everything to do with connectivity. Formulated by Robert Metcalfe as he invented Ethernet, the law essentially states that the value of a network grows with the square of the number of its nodes (see Figure 1–8).
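To make the two laws concrete, here is a minimal sketch, not from the excerpted book; the function names and the 30-year example are purely illustrative. It expresses Koomey's doubling as a formula and Metcalfe's value growth as a square law:

```python
# Illustrative sketch (names and numbers are ours, not the book's).
# Koomey's law: computations per joule double roughly every 1.5 years:
#     efficiency(t) = efficiency(t0) * 2 ** ((t - t0) / 1.5)
# Metcalfe's law: a network's value grows with the square of its node count:
#     value(n) proportional to n ** 2

def koomey_efficiency_factor(years_elapsed: float,
                             doubling_period: float = 1.5) -> float:
    """Multiplicative gain in computations-per-joule after `years_elapsed`."""
    return 2 ** (years_elapsed / doubling_period)

def metcalfe_value(nodes: int) -> int:
    """Relative network value under Metcalfe's law (proportional to n^2)."""
    return nodes ** 2

# Example: efficiency gain over 30 years of 1.5-year doublings.
print(f"Koomey factor over 30 years: {koomey_efficiency_factor(30):,.0f}x")

# Example: doubling the node count quadruples the network's value.
print(metcalfe_value(100) / metcalfe_value(50))   # -> 4.0
```

Thirty years of 1.5-year doublings yields roughly a millionfold efficiency gain, which is what makes the 1992 MacBook Air comparison so stark.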


GEMEL: Model Merging for Memory-Efficient, Real-Time Video Analytics at the Edge

arXiv.org Artificial Intelligence

Video analytics pipelines have steadily shifted to edge deployments to reduce bandwidth overheads and privacy violations, but in doing so, face an ever-growing resource tension. Most notably, edge-box GPUs lack the memory needed to concurrently house the growing number of (increasingly complex) models for real-time inference. Unfortunately, existing solutions that rely on time/space sharing of GPU resources are insufficient as the required swapping delays result in unacceptable frame drops and accuracy violations. We present model merging, a new memory management technique that exploits architectural similarities between edge vision models by judiciously sharing their layers (including weights) to reduce workload memory costs and swapping delays. Our system, GEMEL, efficiently integrates merging into existing pipelines by (1) leveraging several guiding observations about per-model memory usage and inter-layer dependencies to quickly identify fruitful and accuracy-preserving merging configurations, and (2) altering edge inference schedules to maximize merging benefits. Experiments across diverse workloads reveal that GEMEL reduces memory usage by up to 60.7%, and improves overall accuracy by 8-39% relative to time/space sharing alone.
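The abstract's core idea, sharing identical layers (weights included) across co-resident models, can be sketched in a few lines. The following is a hypothetical PyTorch illustration, not GEMEL's actual implementation: two task heads reference one backbone object, so its parameters occupy GPU memory only once.

```python
import torch.nn as nn

# Hypothetical sketch of the layer-sharing idea: two vision models
# reference the same early layers, so that block is stored once
# instead of once per model.

shared_backbone = nn.Sequential(          # layers common to both models
    nn.Conv2d(3, 16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.Conv2d(16, 32, kernel_size=3, padding=1),
    nn.ReLU(),
)

class TaskModel(nn.Module):
    """Task-specific layers stay private; the backbone is shared."""
    def __init__(self, backbone: nn.Module, out_classes: int):
        super().__init__()
        self.backbone = backbone          # a reference, not a copy
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(32, out_classes),
        )

    def forward(self, x):
        return self.head(self.backbone(x))

model_a = TaskModel(shared_backbone, out_classes=10)
model_b = TaskModel(shared_backbone, out_classes=5)

# Both models point at the same parameter tensors for the backbone,
# so its memory cost is paid once.
assert model_a.backbone[0].weight is model_b.backbone[0].weight
```

GEMEL's contribution, per the abstract, is deciding *which* layers can be merged this way without hurting accuracy, and scheduling inference to maximize the benefit; the sketch above only shows the memory-sharing mechanism itself.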


Learning, Computing, and Trustworthiness in Intelligent IoT Environments: Performance-Energy Tradeoffs

arXiv.org Artificial Intelligence

An Intelligent IoT Environment (iIoTe) comprises heterogeneous devices that can collaboratively execute semi-autonomous IoT applications, examples of which include highly automated manufacturing cells or autonomously interacting harvesting machines. Energy efficiency is key in such edge environments, since they are often based on an infrastructure of wireless, battery-powered devices, e.g., e-tractors, drones, Automated Guided Vehicles (AGVs), and robots. The total energy consumption draws contributions from multiple iIoTe technologies that enable edge computing and communication, distributed learning, as well as distributed ledgers and smart contracts. This paper provides a state-of-the-art overview of these technologies and illustrates their functionality and performance, with special attention to the tradeoff among resources, latency, privacy, and energy consumption. Finally, the paper provides a vision for integrating these enabling technologies in ...


An Internet of Things Service Roadmap

Communications of the ACM

The Internet of things (IoT) is taking the world by storm, thanks to the proliferation of sensors and actuators embedded in everyday things, coupled with the wide availability of high-speed Internet [50] and the evolution of 5th-generation (5G) networks [34]. IoT devices are increasingly supplying information about the physical environment (for example, infrastructure, assets, homes, and cars). The advent of IoT is enabling not only the connection and integration of devices that monitor physical world phenomena (for example, temperature, pollution, energy consumption, human activities, and movement), but also data-driven and AI-augmented intelligence. At all levels, synergies from advances in IoT, data analytics, and artificial intelligence (AI) are firmly recognized as strategic priorities for digital transformation [10,41,50]. IoT poses two key challenges [36]: communication with things and management of things [41]. The service paradigm is a key mechanism to overcome these challenges by transforming IoT devices into IoT services, where they will be treated as first-class objects through the prism of services [9]. In a nutshell, services are at a higher level of abstraction than data. Service descriptions consist of two parts: functional and non-functional, such as Quality of Service (QoS) attributes [27]. Services often transform data into actionable knowledge or achieve physical state changes in the operating context [9]. As a result, the service paradigm is the perfect basis for understanding the transformation of data into actionable knowledge, that is, making it useful. Despite the increasing uptake of IoT services, most organizations have not yet mastered the requisite knowledge, skills, or understanding to craft a successful IoT strategy.
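As a rough illustration of the two-part service description the article mentions (functional plus non-functional QoS attributes), here is a hypothetical Python sketch; the field names are invented for illustration and do not come from the article or any standard:

```python
from dataclasses import dataclass, field

# Hypothetical shape of an IoT service description, split into the two
# parts the article names: functional (what the service does) and
# non-functional (QoS attributes). All field names are illustrative.

@dataclass
class QoSAttributes:                     # non-functional part
    latency_ms: float                    # expected response latency
    availability: float                  # fraction of time the service is up
    price_per_call: float

@dataclass
class IoTService:
    name: str                            # functional part begins here
    operation: str                       # e.g., "read_temperature"
    inputs: dict = field(default_factory=dict)
    outputs: dict = field(default_factory=dict)
    qos: QoSAttributes = None            # non-functional part

# Example: a thermostat exposed as a first-class service object.
thermostat = IoTService(
    name="LivingRoomThermostat",
    operation="read_temperature",
    outputs={"temperature": "celsius"},
    qos=QoSAttributes(latency_ms=120.0, availability=0.999,
                      price_per_call=0.0),
)
```

Wrapping a device this way is what lets a platform reason about it abstractly, e.g., selecting among candidate services by QoS rather than by device-specific protocols.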


Best practices to build data literacy into your Gen Z workforce - Data Dreamer

#artificialintelligence

This is a guest post by Kirk Borne, Ph.D., Chief Science Officer at DataPrime.ai. Kirk is also a consultant, astrophysicist, data scientist, blogger, data literacy advocate, and renowned speaker, and is one of the most recognized names in the industry. A survey of 1,100 data practitioners and business leaders reported that 84% of organizations consider data literacy to be a core business skill, agreeing with the statement that the inability of the workforce to use and analyze data effectively can hamper their business success. In addition, 36% said data literacy is crucial to future-proofing their business. Another survey found that 75% of employees are not comfortable using data.


A Review on Edge Analytics: Issues, Challenges, Opportunities, Promises, Future Directions, and Applications

arXiv.org Artificial Intelligence

Edge technology aims to bring Cloud resources (specifically, compute, storage, and network) into close proximity to Edge devices, i.e., the smart devices where the data are produced and consumed. Embedding computing and applications in Edge devices has led to the emergence of two new concepts in Edge technology: Edge computing and Edge analytics. Edge analytics uses techniques or algorithms to analyze the data generated by the Edge devices. With the emergence of Edge analytics, Edge devices have become a complete package. Currently, however, Edge analytics cannot fully support the execution of analytic techniques: Edge devices cannot execute advanced and sophisticated analytic algorithms owing to various constraints such as a limited power supply, small memory size, and limited resources. This article aims to provide a detailed discussion of Edge analytics. It clearly distinguishes among the three concepts of Edge technology, namely, Edge devices, Edge computing, and Edge analytics, along with their issues. Furthermore, the article discusses the implementation of Edge analytics to solve many problems in various areas such as retail, agriculture, industry, and healthcare. In addition, state-of-the-art research papers on Edge analytics are rigorously reviewed in this article to explore the existing issues, emerging challenges, research opportunities and directions, and applications.


Top 25 AI chip companies: A macro step change inferred from the micro scale

#artificialintelligence

One of the effects of the ongoing trade war between the US and China is likely to be the accelerated development of what are being called "artificial intelligence chips", or AI chips for short, also sometimes referred to as AI accelerators. AI chips could play a critical role in economic growth going forward because they will inevitably feature in cars, which are becoming increasingly autonomous; smart homes, where electronic devices are becoming more intelligent; robotics, obviously; and many other technologies. AI chips, as the term suggests, are a new generation of microprocessors designed specifically to process artificial intelligence tasks faster while using less power. Obvious, you might think, but some might wonder what the difference between an AI chip and a regular chip would be when all chips of any type process zeros and ones – a typical processor, after all, is actually capable of AI tasks. Graphics processing units (GPUs) are particularly good at AI-like tasks, which is why they form the basis for many of the AI chips being developed and offered today. Without getting out of our depth, while a general microprocessor is an all-purpose system, AI processors are embedded with logic gates and highly parallel calculation systems that are better suited to typical AI tasks such as image processing, machine vision, machine learning, deep learning, artificial neural networks, and so on. Maybe one could use cars as a metaphor: a general microprocessor is your typical family car that might have good speed and steering capabilities.
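A toy benchmark (ours, not the article's) makes the parallelism point tangible: the workhorse operation of neural networks is matrix multiplication, and hardware that executes many multiply-adds at once beats one-at-a-time scalar execution by orders of magnitude:

```python
import time
import numpy as np

# Toy illustration of why highly parallel hardware suits neural-network
# workloads: compare a vectorized matrix multiply (dispatched to
# parallel SIMD/BLAS kernels) against a scalar triple loop that mimics
# purely sequential, one-value-at-a-time processing.

a = np.random.rand(128, 128).astype(np.float32)
b = np.random.rand(128, 128).astype(np.float32)

def matmul_scalar(x, y):
    """Matrix multiply the way a strictly sequential processor would."""
    n, k, m = x.shape[0], x.shape[1], y.shape[1]
    out = np.zeros((n, m), dtype=np.float32)
    for i in range(n):
        for j in range(m):
            s = 0.0
            for p in range(k):
                s += x[i, p] * y[p, j]   # one multiply-add at a time
            out[i, j] = s
    return out

t0 = time.perf_counter()
_ = a @ b                                # parallel, vectorized path
t_parallel = time.perf_counter() - t0

t0 = time.perf_counter()
_ = matmul_scalar(a, b)                  # sequential path
t_scalar = time.perf_counter() - t0

print(f"parallel: {t_parallel:.6f}s  scalar: {t_scalar:.2f}s")
```

On typical hardware the vectorized product finishes in a fraction of a millisecond while the scalar loop takes seconds; dedicated AI chips push this same parallelism much further with specialized matrix units.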


Global Big Data Conference

#artificialintelligence

Today, technologies like Artificial Intelligence (AI) and Machine Learning (ML) have become integrated into our lives in such a way that it is impossible to imagine a world without them. Think about smart virtual assistants (Siri and Alexa), the recommendation engines on online shopping platforms (Amazon and Netflix), self-driving cars, and smart homes: they are all applications of ML. Certainly, the inclusion of these radical technological innovations has made our lives so much more comfortable. Although ML has been around for a long time (consider, for instance, Turing's Bombe, built to break the Enigma cipher), it is only recently that interest in the concept has surged. As more companies incline toward advanced ML solutions and technologies, students and professionals are being encouraged to take up machine learning courses.


The Leading Industry 4.0 Companies 2019 - Vendor Map

#artificialintelligence

While conducting research for the recently released Industry 4.0 and Smart Manufacturing Market Report, IoT Analytics identified 300 leading Industry 4.0 companies that supply cutting-edge products and services driving the fourth industrial revolution. The leading Industry 4.0 companies were selected based on a number of criteria (case studies, product offerings, estimated market share, etc.) and were categorized by the type of Industry 4.0 product or service they supply. Building on its long history of supporting industrial automation companies, Microsoft has emerged as the hosting partner of choice for many Industry 4.0 companies. Both end users (manufacturing facilities) and suppliers (OEMs, industrial automation companies, etc.) have partnered with Microsoft for decades to develop and run mission-critical on-premises SCADA and MES applications. Microsoft's deep domain knowledge and technical capabilities (especially with respect to hybrid cloud solutions) have helped it become a leading provider of hosting services for major manufacturing end users and suppliers such as Siemens, PTC, GE, and Emerson.


AI And Big Data: Two Major Parts of The Digital Future

#artificialintelligence

If you follow technology news in any way, you've undoubtedly noticed that AI and big data are trending topics. Both technologies are certainly the driving force behind a variety of tech innovations. In the following paragraphs, we'll explore exactly what AI and big data are, how they work together, and the ways in which both will disrupt the digital future. Artificial intelligence is the technology that allows computers to do things that were once solely the domain of humans. For example, computers have always been able to calculate.