CyberCortex.AI: An AI-based Operating System for Autonomous Robotics and Complex Automation
Grigorescu, Sorin, Zaha, Mihai
The underlying frameworks for controlling autonomous robots and complex automation applications are Operating Systems (OS) capable of scheduling perception-and-control tasks, as well as providing real-time data communication to other robotic peers and remote cloud computers. In this paper, we introduce CyberCortex.AI, a robotics OS designed to enable heterogeneous AI-based robotics and complex automation applications. CyberCortex.AI is a decentralized, distributed OS which enables robots to communicate with each other, as well as with High Performance Computers (HPC) in the cloud. Sensory and control data from the robots are streamed to HPC systems in order to train AI algorithms, which are afterwards deployed back on the robots. Each functionality of a robot (e.g. sensory data acquisition, path planning, motion control, etc.) is executed within a so-called DataBlock of Filters shared through the internet, where each filter is computed either locally on the robot itself or remotely on a different robotic system. The data is stored and accessed via a so-called \textit{Temporal Addressable Memory} (TAM), which acts as a gateway between each filter's inputs and outputs. CyberCortex.AI has two main components: i) the CyberCortex.AI.inference system, a real-time implementation of the DataBlock running on the robots' embedded hardware, and ii) CyberCortex.AI.dojo, which runs on an HPC computer in the cloud and is used to design, train and deploy AI algorithms. We present a quantitative and qualitative performance analysis of the proposed approach using two collaborative robotics applications: \textit{i}) a forest-fire prevention system based on a Unitree A1 legged robot and a Parrot Anafi 4K drone, as well as \textit{ii}) an autonomous driving system which uses CyberCortex.AI for collaborative perception and motion control.
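The DataBlock-of-Filters architecture mediated by a Temporal Addressable Memory lends itself to a compact illustration. The following is a minimal Python sketch, assuming a simple timestamped key-value store and naive round-robin scheduling; all class and method names here are hypothetical and are not taken from the CyberCortex.AI API.

import time
from collections import defaultdict

class TemporalAddressableMemory:
    # Timestamped key-value store acting as the gateway between filter inputs/outputs
    def __init__(self):
        self._store = defaultdict(list)            # key -> [(timestamp, value), ...]

    def write(self, key, value):
        self._store[key].append((time.time(), value))

    def read_latest(self, key):
        entries = self._store[key]
        return entries[-1][1] if entries else None

class Filter:
    # One robot functionality (sensing, planning, ...); could run locally or remotely
    def __init__(self, name, input_keys, output_key, fn):
        self.name, self.input_keys = name, input_keys
        self.output_key, self.fn = output_key, fn

    def step(self, tam):
        inputs = [tam.read_latest(k) for k in self.input_keys]
        if all(v is not None for v in inputs):     # fire only once all inputs exist
            tam.write(self.output_key, self.fn(inputs))

# Toy DataBlock: camera -> perception -> planner, exchanging data only through the TAM
tam = TemporalAddressableMemory()
datablock = [
    Filter("camera", [], "image", lambda _: "frame"),
    Filter("perception", ["image"], "objects", lambda x: f"objects({x[0]})"),
    Filter("planner", ["objects"], "trajectory", lambda x: f"plan({x[0]})"),
]
for _ in range(3):                                 # naive round-robin scheduling
    for f in datablock:
        f.step(tam)
print(tam.read_latest("trajectory"))               # -> plan(objects(frame))

Because every filter reads and writes only through the TAM, moving a filter to a remote peer amounts to replicating the relevant TAM keys over the network, which is the property the decentralized design relies on.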
Anomaly Detection in Time Series of EDFA Pump Currents to Monitor Degeneration Processes using Fuzzy Clustering
Schneider, Dominic, Rapp, Lutz, Ament, Christoph
This article proposes a novel fuzzy clustering based anomaly detection method for pump current time series of EDFA systems. The proposed change detection framework (CDF) strategically combines the advantages of entropy analysis (EA) and principal component analysis (PCA) with fuzzy clustering procedures. In the framework, EA is applied for dynamic feature selection, reducing the feature space and increasing computational performance. Furthermore, PCA is utilized to extract features from the raw feature space and thereby enable the generalization capability of the subsequent fuzzy clustering procedures. Three different fuzzy clustering methods, namely a standard fuzzy clustering algorithm, a probabilistic clustering algorithm and a possibilistic clustering algorithm, are evaluated for performance and generalization. As a result, the proposed framework can detect changes in pump current time series at an early stage and for arbitrary points of operation, in contrast to the predefined alarms of commercially deployed EDFAs. Moreover, the approach is implemented and tested on experimental data. The proposed framework also enables further approaches towards decentralized predictive maintenance for optical fiber networks.
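A rough sketch of such an EA-plus-PCA-plus-fuzzy-clustering pipeline is given below, assuming synthetic features, entropy-based top-k selection, a textbook fuzzy c-means implementation, and a possibilistic-style typicality score for flagging changes; the thresholds and the drift-injection step are illustrative assumptions, not the authors' method.

import numpy as np
from scipy.stats import entropy
from sklearn.decomposition import PCA

def select_features_by_entropy(X, k):
    # EA step: keep the k features whose empirical histograms carry the most information
    ents = [entropy(np.histogram(X[:, j], bins=16)[0] + 1) for j in range(X.shape[1])]
    return np.argsort(ents)[-k:]

def fuzzy_cmeans(X, c=2, m=2.0, iters=100, seed=0):
    # Textbook fuzzy c-means: alternate center and membership updates
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), c)); U /= U.sum(1, keepdims=True)
    for _ in range(iters):
        W = U ** m
        centers = (W.T @ X) / W.sum(0)[:, None]
        d = np.linalg.norm(X[:, None] - centers[None], axis=2) + 1e-12
        U = 1.0 / (d ** (2 / (m - 1)) * (d ** (-2 / (m - 1))).sum(1, keepdims=True))
    return centers, U

# Offline: fit the pipeline on a nominal window of synthetic "pump-current" features
rng = np.random.default_rng(1)
X_ref = rng.normal(0.0, 1.0, (200, 10))
sel = select_features_by_entropy(X_ref, k=5)
pca = PCA(n_components=2).fit(X_ref[:, sel])
Z_ref = pca.transform(X_ref[:, sel])
centers, _ = fuzzy_cmeans(Z_ref, c=2)
eta = np.median(((Z_ref[:, None] - centers[None]) ** 2).sum(2).min(1))  # nominal scale

def typicality(z):
    # Possibilistic-style typicality: decays with distance to the nearest center
    d2 = ((z - centers) ** 2).sum(1)
    return float((1.0 / (1.0 + d2 / eta)).max())

# Online: a window drifted along the first principal direction gets low typicality
X_new = rng.normal(0.0, 1.0, (50, 10))
X_new[:, sel] += 4.0 * pca.components_[0]        # synthetic drift injection
z = pca.transform(X_new[:, sel]).mean(0)         # summarize the monitored window
print("anomaly" if typicality(z) < 0.3 else "nominal")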
Disentangling Learnable and Memorizable Data via Contrastive Learning for Semantic Communications
Chaccour, Christina, Saad, Walid
Achieving artificial intelligence (AI)-native wireless networks is necessary for the operation of future 6G applications such as the metaverse. Nonetheless, current communication schemes are, at heart, a mere reconstruction process that lacks reasoning. One key solution that enables evolving wireless communication towards human-like conversation is semantic communications. In this paper, a novel machine reasoning framework is proposed to pre-process and disentangle source data so as to make it semantic-ready. In particular, a novel contrastive learning framework is proposed, whereby instance and cluster discrimination are performed on the data. These two tasks increase the cohesiveness between data points that map to semantically similar content elements and disentangle data points of semantically different content elements. Subsequently, the semantic deep clusters formed are ranked according to their level of confidence. The deep semantic clusters with the highest confidence are considered learnable, semantic-rich data, i.e., data that can be used to build a language in a semantic communications system. The least confident ones are considered random, semantic-poor, memorizable data that must be transmitted classically. Our simulation results showcase the superiority of our contrastive learning approach in terms of semantic impact and minimalism. In fact, the length of the semantic representation achieved is reduced by 57.22% compared to vanilla semantic communication systems, thus achieving minimalist semantic representations.
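To make the instance- and cluster-discrimination idea concrete, here is a minimal PyTorch sketch of the two contrastive losses and a confidence-based cluster ranking. The loss formulations follow standard contrastive-clustering practice; the function names, temperatures and confidence measure are assumptions rather than the authors' implementation.

import torch
import torch.nn.functional as F

def instance_nce(z1, z2, tau=0.5):
    # Instance discrimination: row i of z1 should match row i of z2 (two augmented views)
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.T / tau
    return F.cross_entropy(logits, torch.arange(len(z1)))

def cluster_nce(p1, p2, tau=1.0):
    # Cluster discrimination: contrast each cluster's assignment profile across views
    c1, c2 = F.normalize(p1.T, dim=1), F.normalize(p2.T, dim=1)
    logits = c1 @ c2.T / tau
    return F.cross_entropy(logits, torch.arange(len(c1)))

def cluster_confidence(p):
    # Rank clusters by the mean soft assignment of their members:
    # high confidence => learnable, semantic-rich; low => memorizable, semantic-poor
    conf = torch.zeros(p.shape[1])
    hard = p.argmax(1)
    for k in range(p.shape[1]):
        members = hard == k
        conf[k] = p[members, k].mean() if members.any() else 0.0
    return conf

# Toy usage with random tensors standing in for an encoder's outputs on two views
n, dim, k = 8, 32, 4
z1, z2 = torch.randn(n, dim), torch.randn(n, dim)
p1, p2 = torch.softmax(torch.randn(n, k), 1), torch.softmax(torch.randn(n, k), 1)
loss = instance_nce(z1, z2) + cluster_nce(p1, p2)
print(loss.item(), cluster_confidence(p1))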
Less Data, More Knowledge: Building Next Generation Semantic Communication Networks
Chaccour, Christina, Saad, Walid, Debbah, Merouane, Han, Zhu, Poor, H. Vincent
Semantic communication is viewed as a revolutionary paradigm that can potentially transform how we design and operate wireless communication systems. However, despite a recent surge of research activities in this area, the research landscape remains limited. In this tutorial, we present the first rigorous vision of a scalable end-to-end semantic communication network that is founded on novel concepts from artificial intelligence (AI), causal reasoning, and communication theory. We first discuss how the design of semantic communication networks requires a move from data-driven networks towards knowledge-driven ones. Subsequently, we highlight the necessity of creating semantic representations of data that satisfy the key properties of minimalism, generalizability, and efficiency so as to do more with less. We then explain how those representations can form the basis of a so-called semantic language. By using semantic representations and languages, we show that the traditional transmitter and receiver now become a teacher and an apprentice. Then, we define the concept of reasoning by investigating the fundamentals of causal representation learning and their role in designing semantic communication networks. We demonstrate that reasoning faculties are chiefly characterized by the ability to capture causal and associational relationships in datastreams. For such reasoning-driven networks, we propose novel and essential semantic communication metrics that include new "reasoning capacity" measures that could go beyond Shannon's bound to capture the convergence of computing and communication. Finally, we explain how semantic communications can be scaled to large-scale networks (6G and beyond). In a nutshell, we expect this tutorial to provide a comprehensive reference on how to properly build, analyze, and deploy future semantic communication networks.
Class Distribution Monitoring for Concept Drift Detection
Stucchi, Diego, Frittoli, Luca, Boracchi, Giacomo
We introduce Class Distribution Monitoring (CDM), an effective concept-drift detection scheme that monitors the class-conditional distributions of a datastream. In particular, our solution leverages multiple instances of an online and nonparametric change-detection algorithm based on QuantTree. CDM reports a concept drift after detecting a distribution change in any class, thus identifying which classes are affected by the concept drift. This is valuable information for diagnostics and adaptation. Our experiments on synthetic and real-world datastreams show that when the concept drift affects a few classes, CDM outperforms algorithms monitoring the overall data distribution, while achieving similar detection delays when the drift affects all the classes. Moreover, CDM outperforms comparable approaches that monitor the classification error, particularly when the change is not very apparent. Finally, we demonstrate that CDM inherits the properties of the underlying change detector, yielding effective control over the expected time before a false alarm, or Average Run Length (ARL$_0$).
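The per-class monitoring scheme is easy to prototype. In the sketch below, a univariate equal-probability histogram with a Pearson chi-square test stands in for the QuantTree detector (an assumption made for brevity); one detector per class flags which class-conditional distribution has drifted.

import numpy as np
from scipy.stats import chi2

class HistogramDetector:
    # Monitors one class: compares batch histograms to the training bin probabilities
    def __init__(self, train, bins=8, alpha=0.01):
        self.edges = np.quantile(train, np.linspace(0, 1, bins + 1))
        self.edges[0], self.edges[-1] = -np.inf, np.inf   # equal-probability bins
        self.expected_frac = 1.0 / bins
        self.threshold = chi2.ppf(1 - alpha, df=bins - 1)

    def is_changed(self, batch):
        counts = np.histogram(batch, bins=self.edges)[0]
        expected = self.expected_frac * len(batch)
        stat = ((counts - expected) ** 2 / expected).sum()  # Pearson statistic
        return stat > self.threshold

# One detector per class; any alarm also names the affected class, as CDM does
rng = np.random.default_rng(0)
detectors = {c: HistogramDetector(rng.normal(c, 1, 500)) for c in (0, 1)}
batch = {0: rng.normal(0, 1, 200), 1: rng.normal(3, 1, 200)}  # drift only in class 1
print("drifted classes:", [c for c, d in detectors.items() if d.is_changed(batch[c])])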
Nonparametric and Online Change Detection in Multivariate Datastreams using QuantTree
Frittoli, Luca, Carrera, Diego, Boracchi, Giacomo
We address the problem of online change detection in multivariate datastreams, and we introduce QuantTree Exponentially Weighted Moving Average (QT-EWMA), a nonparametric change-detection algorithm that can control the expected time before a false alarm, yielding a desired Average Run Length (ARL$_0$). Controlling false alarms is crucial in many applications and is rarely guaranteed by online change-detection algorithms that can monitor multivariate datastreams without knowing the data distribution. Like many change-detection algorithms, QT-EWMA builds a model of the data distribution, in our case a QuantTree histogram, from a stationary training set. To monitor datastreams even when the training set is extremely small, we propose QT-EWMA-update, which incrementally updates the QuantTree histogram during monitoring, always keeping the ARL$_0$ under control. Our experiments, performed on synthetic and real-world datastreams, demonstrate that QT-EWMA and QT-EWMA-update control the ARL$_0$ and the false alarm rate better than state-of-the-art methods operating in similar conditions, achieving lower or comparable detection delays.
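A simplified sketch of the QT-EWMA mechanism follows: an EWMA of one-hot bin indicators is compared against the training bin probabilities through a Pearson-like statistic. The real algorithm uses a multivariate QuantTree histogram and time-varying thresholds calibrated for a target ARL$_0$; the univariate quantile bins and the fixed threshold below are illustrative assumptions.

import numpy as np

class QTEWMALike:
    # EWMA of one-hot bin indicators vs. training bin probabilities (simplified)
    def __init__(self, train, bins=8, lam=0.02, threshold=0.25):
        self.edges = np.quantile(train, np.linspace(0, 1, bins + 1))
        self.edges[0], self.edges[-1] = -np.inf, np.inf   # equal-probability bins
        self.p0 = np.full(bins, 1.0 / bins)
        self.z = self.p0.copy()
        self.lam, self.threshold = lam, threshold

    def update(self, x):
        onehot = np.zeros(len(self.p0))
        onehot[np.searchsorted(self.edges, x, side="right") - 1] = 1.0
        self.z = (1 - self.lam) * self.z + self.lam * onehot
        stat = ((self.z - self.p0) ** 2 / self.p0).sum()  # Pearson-like statistic
        return stat > self.threshold

rng = np.random.default_rng(0)
detector = QTEWMALike(rng.normal(0, 1, 1000))
stream = np.concatenate([rng.normal(0, 1, 300), rng.normal(2, 1, 300)])  # change at t = 300
alarms = [t for t, x in enumerate(stream) if detector.update(x)]
print("first alarm at t =", alarms[0] if alarms else "none")

The EWMA forgetting factor lam trades detection delay against false alarms; QT-EWMA's contribution is precisely to replace the fixed threshold above with a time-varying sequence that keeps the ARL$_0$ under control.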
Google's Data Cloud Summit Serves Up A Breadth Of New Capabilities
I mentioned in my preview of the Google Data Cloud Summit this week that I was expecting some exciting technology announcements for AI, machine learning, data management, and analytics. Google did not disappoint in that department. Google Cloud's stated mission is to accelerate every organization's ability to transform through data-powered innovation. That theme should come as no surprise to anyone, but in Google's case it is backed by a slew of new technologies and innovations. In this article, I will unpack and analyze some of the announcements from this busy week.
Change Detection in Multivariate Datastreams: Likelihood and Detectability Loss
Alippi, Cesare, Boracchi, Giacomo, Carrera, Diego, Roveri, Manuel
We address the problem of detecting changes in multivariate datastreams, and we investigate the intrinsic difficulty that change-detection methods face when the data dimension scales. In particular, we consider a general approach where changes are detected by comparing the distribution of the log-likelihood of the datastream over different time windows. Although this approach underpins several change-detection methods, its effectiveness when the data dimension scales has never been investigated, which is precisely the goal of our paper. We show that the magnitude of the change can be naturally measured by the symmetric Kullback-Leibler divergence between the pre- and post-change distributions, and that the detectability of a change of a given magnitude worsens when the data dimension increases. This problem, which we refer to as \emph{detectability loss}, is due to the linear relationship between the variance of the log-likelihood and the data dimension. We analytically derive the detectability loss on Gaussian-distributed datastreams, and empirically demonstrate that this problem also holds on real-world datasets and that it can be harmful even at low data dimensions (say, 10).
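The detectability-loss argument can be reproduced numerically in a few lines. The following illustration (my own, not taken from the paper) fixes the change magnitude at sKL = ||mu1 - mu0||^2 = 1 via a unit mean shift and shows that the expected drop in log-likelihood stays at 1/2 while its standard deviation grows as sqrt(d/2), so the standardized change decays as 1/sqrt(2d).

import numpy as np

rng = np.random.default_rng(0)
n = 5_000
for d in (2, 10, 100, 1000):
    Z = rng.normal(size=(n, d))                  # pre-change data ~ N(0, I)
    mu1 = np.zeros(d); mu1[0] = 1.0              # post-change mean: sKL = 1 for any d
    ll_pre = -0.5 * (Z ** 2).sum(1)              # log N(0, I) up to an additive constant
    ll_post = -0.5 * ((Z + mu1) ** 2).sum(1)     # same noise reused (variance reduction)
    shift = ll_pre.mean() - ll_post.mean()       # ~ 0.5 regardless of d
    print(f"d={d:4d}  shift={shift:.3f}  std={ll_pre.std():6.2f}  "
          f"standardized={shift / ll_pre.std():.4f}")   # decays as 1/sqrt(2 d)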