The traditional paradigm of large-batch production does not offer the flexibility needed to satisfy the requirements of individual customers. A new generation of smart factories is expected to support multi-variety, small-batch customized production modes. To that end, Artificial Intelligence (AI) is enabling higher value-added manufacturing by accelerating the integration of manufacturing with information and communication technologies, including computing, communication, and control. The characteristics of a customized smart factory include self-perception, operations optimization, dynamic reconfiguration, and intelligent decision-making. AI technologies will allow manufacturing systems to perceive the environment, adapt to external needs, and extract process knowledge, supporting business models such as intelligent production, networked collaboration, and extended services. This paper focuses on the implementation of AI in customized manufacturing (CM). The architecture of an AI-driven customized smart factory is presented, and details of intelligent manufacturing devices, intelligent information interaction, and the construction of a flexible manufacturing line are showcased. The state-of-the-art AI technologies of potential use in CM, i.e., machine learning, multi-agent systems, Internet of Things, big data, and cloud-edge computing, are surveyed. The AI-enabled technologies in a customized smart factory are validated with a case study of customized packaging. The experimental results demonstrate that AI-assisted CM offers the possibility of higher production flexibility and efficiency. Challenges and solutions related to AI in CM are also discussed.
The accelerating convergence of artificial intelligence (AI) and the Internet of Things (IoT) has sparked a recent wave of interest in the Artificial Intelligence of Things (AIoT). This article covers the basics of AIoT and discusses how this emerging technology drives the development of disruptive applications, software, sensors, and systems. AIoT combines the connectivity of IoT with the data-driven knowledge obtained from AI; in essence, it is the integration of AI into IoT infrastructure.
With the rapid pace of technological innovation, the need for greater market responsiveness, and the rising cost of labor in nearly all economies, many companies are revisiting age-old manufacturing strategies. They recognize a growing need to introduce innovative products faster to meet customer demands while maintaining aggressive cost and quality objectives. Traditional manufacturing approaches can no longer keep pace with this dynamic, consumer-driven age. Meeting these demands will instead require a complete reinvention of how we approach manufacturing, and this reinvention will need to unfold on a scale that amounts to a new industrial revolution. Welcome to the era of digital manufacturing, which can be defined simply as the growing application and impact of digital connectivity linking automation, workers, and decision-makers.
When new industry buzzwords or phrases come up, the challenge for people like us who write about the topic is figuring out what exactly a company means, especially when it uses the phrase to fit its own marketing objective. The latest one is edge artificial intelligence, or edge AI. Because of the proliferation of the internet of things (IoT) and the ability to add a fair amount of compute power or processing to enable intelligence within those devices, the 'edge' can be quite wide, and could mean anything from the 'edge of a gateway' to an 'endpoint'. So, we decided to find out if there was consensus in the industry on the definition of edge vs. endpoint, who would want to add edge AI, and how much 'smartness' you could add to the edge. First of all, what is the difference between edge and endpoint? It depends on your viewpoint: anything not in the cloud could be defined as edge. Probably the clearest definition came from Wolfgang Furtner, Infineon Technologies' senior principal for concept and system engineering.
In a little over a decade since researchers uncovered novel techniques to improve its efficiency and effectiveness, deep learning has become a practical technology that now underpins a number of applications that need artificial intelligence (AI). Many of these applications are hosted in the cloud on powerful servers, as tasks often involve processing data-rich sources such as images, videos, and audio. Those servers frequently call on the additional performance of acceleration hardware, ranging from graphics processing units to custom devices. These accelerators become particularly important for training, the numerically intensive process during which a neural network learns from new data. Typically, the inferencing process, which uses a trained network to assess new data, is far less compute-intensive than training.
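The training-versus-inference cost gap described above can be made concrete with a toy example. The sketch below (pure Python, with an entirely hypothetical single-weight model and made-up learning rate and epoch count) shows why: training repeatedly loops over the data and computes a gradient for every sample, while inference is a single forward pass.

```python
# Toy illustration of training vs. inference cost for a single "neuron"
# learning y = 2x by gradient descent on squared error.
# All names and hyperparameters here are illustrative assumptions.

def forward(w, x):
    """Inference: one forward pass, a single multiply."""
    return w * x

def train(w, data, lr=0.01, epochs=100):
    """Training: epochs * len(data) gradient updates, each of which
    also includes a forward pass, so far more work than inference."""
    for _ in range(epochs):
        for x, y in data:
            pred = forward(w, x)
            grad = 2 * (pred - y) * x  # d/dw of (w*x - y)^2
            w -= lr * grad
    return w

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]
w = train(0.0, data)        # 300 gradient steps to fit one weight
print(round(w, 2))          # learned weight, close to 2.0
print(forward(w, 5.0))      # deployment: a single multiply per query
```

Real networks have millions to billions of weights rather than one, but the asymmetry is the same, which is why accelerators matter most during training while inference can often run on far more modest hardware.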