
Collaborating Authors

Manias, Dimitrios Michael


Network Resource Optimization for ML-Based UAV Condition Monitoring with Vibration Analysis

arXiv.org Artificial Intelligence

Accepted in IEEE Networking Letters. Authors: Alexandre Gemayel, Dimitrios Michael Manias, and Abdallah Shami. As smart cities begin to materialize, the role of Unmanned Aerial Vehicles (UAVs) and their reliability becomes increasingly important. One aspect of reliability relates to Condition Monitoring (CM), where Machine Learning (ML) models are leveraged to identify abnormal and adverse conditions. Given the resource-constrained nature of next-generation edge networks, the utilization of precious network resources must be minimized. This work explores the optimization of network resources for ML-based UAV CM frameworks. The developed framework uses experimental data and varies the feature extraction aggregation interval to optimize ML model selection. Additionally, by leveraging dimensionality reduction techniques, there is a 99.9% reduction in network resource consumption. From the introduction: Emerging Unmanned Aerial Vehicle (UAV) applications, such as smart cities, have highlighted the necessity of real-time Condition Monitoring (CM) through Anomaly Detection (AD) and health analytics to ensure operational safety and integrity [1].
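To make the resource-reduction idea concrete, here is a minimal Python sketch (not the authors' implementation) of aggregating raw vibration samples over a configurable interval, extracting a few time/frequency features, and compressing them with PCA before transmission; the window length, feature set, and component count are illustrative assumptions.

# Sketch: aggregate vibration samples into interval-level features and
# compress them with PCA before transmission (illustrative values only).
import numpy as np
from sklearn.decomposition import PCA

def extract_features(window: np.ndarray) -> np.ndarray:
    """Simple time/frequency features for one aggregation interval (assumed feature set)."""
    return np.array([
        window.mean(),
        window.std(),
        np.sqrt(np.mean(window ** 2)),          # RMS
        window.max() - window.min(),            # peak-to-peak
        np.abs(np.fft.rfft(window)).argmax(),   # dominant frequency bin
    ])

def aggregate(signal: np.ndarray, interval: int) -> np.ndarray:
    """Split the raw signal into fixed-length intervals and featurize each one."""
    n = len(signal) // interval
    return np.vstack([extract_features(signal[i * interval:(i + 1) * interval]) for i in range(n)])

rng = np.random.default_rng(0)
raw = rng.normal(size=50_000)                   # stand-in for accelerometer samples
features = aggregate(raw, interval=1_000)       # feature-extraction aggregation interval
pca = PCA(n_components=2).fit(features)         # transmit only the leading components
compressed = pca.transform(features)
print(features.nbytes, "->", compressed.nbytes, "bytes per reporting period")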


Towards Intent-Based Network Management: Large Language Models for Intent Extraction in 5G Core Networks

arXiv.org Artificial Intelligence

The integration of Machine Learning and Artificial Intelligence (ML/AI) into fifth-generation (5G) networks has made evident the limitations of network intelligence with ever-increasing, strenuous requirements for current and next-generation devices. This transition to ubiquitous intelligence demands high connectivity, synchronicity, and end-to-end communication between users and network operators, and will pave the way towards full network automation without human intervention. Intent-based networking is a key factor in the reduction of human actions, roles, and responsibilities while shifting towards novel extraction and interpretation of automated network management. This paper presents the development of a custom Large Language Model (LLM) for 5G and next-generation intent-based networking and provides insights into future LLM developments and integrations to realize end-to-end intent-based networking for fully automated network intelligence.
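As a rough illustration of the intent-extraction step, the sketch below prompts a language model to return a structured JSON intent from a free-form operator request; call_llm, the prompt template, and the JSON fields are hypothetical placeholders, not the custom LLM described in the paper.

# Sketch: turn a free-form operator request into a structured 5G core intent.
import json

INTENT_PROMPT = """Extract the network intent from the request below.
Respond with JSON containing: "action", "target_slice", "kpi", "threshold".
Request: {request}
JSON:"""

def call_llm(prompt: str) -> str:
    # Placeholder: replace with the completion API of whatever LLM is deployed.
    return '{"action": "scale", "target_slice": "eMBB-1", "kpi": "latency", "threshold": "10ms"}'

def extract_intent(request: str) -> dict:
    raw = call_llm(INTENT_PROMPT.format(request=request))
    return json.loads(raw)   # downstream orchestration consumes this structure

print(extract_intent("Keep latency on slice eMBB-1 under 10 ms even at peak load."))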


Semantic Routing for Enhanced Performance of LLM-Assisted Intent-Based 5G Core Network Management and Orchestration

arXiv.org Artificial Intelligence

Large language models (LLMs) are rapidly emerging in Artificial Intelligence (AI) applications, especially in the fields of natural language processing and generative AI. Not limited to text generation applications, these models inherently possess the opportunity to leverage prompt engineering, where the inputs of such models can be appropriately structured to articulate a model's purpose explicitly. A prominent example of this is intent-based networking, an emerging approach for automating and maintaining network operations and management. This paper presents semantic routing to achieve enhanced performance in LLM-assisted intent-based management and orchestration of 5G core networks. This work establishes an end-to-end intent extraction framework and presents a diverse dataset of sample user intents accompanied by a thorough analysis of the effects of encoders and quantization on overall system performance. The results show that using a semantic router improves the accuracy and efficiency of the LLM deployment compared to stand-alone LLMs with prompting architectures.
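A minimal sketch of the semantic-routing idea follows: incoming intents are embedded, compared against per-route example utterances, and dispatched to the closest route's specialized prompt or pipeline. TF-IDF stands in here for the sentence encoders studied in the paper, and the routes and example utterances are invented for illustration.

# Sketch of a semantic router over user intents (TF-IDF as a stand-in encoder).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

ROUTES = {
    "scaling":  ["scale up the slice", "add capacity to the core", "increase UPF instances"],
    "security": ["block this subscriber", "mitigate the attack", "isolate suspicious traffic"],
    "qos":      ["guarantee low latency", "raise the throughput target", "prioritize video traffic"],
}

corpus = [u for examples in ROUTES.values() for u in examples]
labels = [name for name, examples in ROUTES.items() for _ in examples]
encoder = TfidfVectorizer().fit(corpus)
route_vecs = encoder.transform(corpus)

def route(intent: str) -> str:
    sims = cosine_similarity(encoder.transform([intent]), route_vecs)[0]
    return labels[int(sims.argmax())]   # selects the specialized prompt/pipeline

print(route("guarantee latency under 5 ms for video calls"))   # expected: "qos"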


Machine Learning for Pre/Post Flight UAV Rotor Defect Detection Using Vibration Analysis

arXiv.org Artificial Intelligence

Unmanned Aerial Vehicles (UAVs) will be critical infrastructural components of future smart cities. In order to operate efficiently, UAV reliability must be ensured by constant monitoring for faults and failures. To this end, the work presented in this paper leverages signal processing and Machine Learning (ML) methods to analyze the data of a comprehensive vibrational analysis to determine the presence of rotor blade defects during pre and post-flight operation. With the help of dimensionality reduction techniques, the Random Forest algorithm exhibited the best performance and detected defective rotor blades perfectly. Additionally, a comprehensive analysis of the impact of various feature subsets is presented to gain insight into the factors affecting the model's classification decision process.
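The classification stage described in the abstract, dimensionality reduction followed by a Random Forest, can be sketched as below; the synthetic data and hyperparameters are placeholders rather than the paper's vibration dataset or tuned settings.

# Sketch: PCA + Random Forest for healthy vs. defective rotor classification.
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

# Stand-in for vibration features extracted during pre/post-flight checks.
X, y = make_classification(n_samples=600, n_features=40, n_informative=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

model = make_pipeline(PCA(n_components=10),
                      RandomForestClassifier(n_estimators=200, random_state=0))
model.fit(X_tr, y_tr)
print(classification_report(y_te, model.predict(X_te), target_names=["healthy", "defective"]))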


Leveraging Large Language Models for DRL-Based Anti-Jamming Strategies in Zero Touch Networks

arXiv.org Artificial Intelligence

As the dawn of sixth-generation (6G) networking approaches, it promises unprecedented advancements in communication and automation. Among the leading innovations of 6G is the concept of Zero Touch Networks (ZTNs), aiming to achieve fully automated, self-optimizing networks with minimal human intervention. Despite the advantages ZTNs offer in terms of efficiency and scalability, challenges surrounding transparency, adaptability, and human trust remain prevalent. Concurrently, the advent of Large Language Models (LLMs) presents an opportunity to elevate the ZTN framework by bridging the gap between automated processes and human-centric interfaces. This paper explores the integration of LLMs into ZTNs, highlighting their potential to enhance network transparency and improve user interactions. Through a comprehensive case study on a deep reinforcement learning (DRL)-based anti-jamming technique, we demonstrate how LLMs can distill intricate network operations into intuitive, human-readable reports. Additionally, we address the technical and ethical intricacies of melding LLMs with ZTNs, with an emphasis on data privacy, transparency, and bias reduction. Looking ahead, we identify emerging research avenues at the nexus of LLMs and ZTNs, advocating for sustained innovation and interdisciplinary synergy in the domain of automated networks.
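The reporting step can be pictured as in the following sketch: episode-level DRL metrics are folded into a prompt and handed to an LLM for a plain-language summary. The metric names and the call_llm placeholder are assumptions, not the case-study implementation.

# Sketch: summarize DRL anti-jamming episode metrics into a human-readable report.
episode = {
    "jamming_detected": True,
    "channel_switches": 4,
    "mean_reward": 0.82,
    "throughput_recovered_pct": 91,
}

REPORT_PROMPT = (
    "You are a network operations assistant. Using the metrics below, write a short, "
    "non-technical report of what the anti-jamming agent did and why.\n"
    f"Metrics: {episode}"
)

def call_llm(prompt: str) -> str:
    # Placeholder: replace with the actual LLM endpoint used in the ZTN deployment.
    return ("Jamming was detected on the active channel. The agent switched channels "
            "4 times and restored about 91% of normal throughput.")

print(call_llm(REPORT_PROMPT))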


PWPAE: An Ensemble Framework for Concept Drift Adaptation in IoT Data Streams

arXiv.org Artificial Intelligence

As the number of Internet of Things (IoT) devices and systems has surged, IoT data analytics techniques have been developed to detect malicious cyber-attacks and secure IoT systems; however, concept drift issues often occur in IoT data analytics, as IoT data often consists of dynamic data streams that change over time, causing model degradation and attack detection failure. This is because traditional data analytics models are static models that cannot adapt to data distribution changes. In this paper, we propose a Performance Weighted Probability Averaging Ensemble (PWPAE) framework for drift adaptive IoT anomaly detection through IoT data stream analytics. Experiments on two public datasets show the effectiveness of our proposed PWPAE method compared against state-of-the-art methods. With the rapid development of the Internet of Things (IoT), IoT devices have provided numerous new capabilities, yet IoT traffic data often changes unpredictably over time, known as concept drift [15], [16].
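The core mechanism, weighting each online base learner's predicted probabilities by its recent prequential accuracy, can be sketched as follows; the SGD base models, window size, and synthetic drifting stream are stand-ins, not the PWPAE components evaluated in the paper.

# Sketch: performance-weighted probability averaging over online base learners.
from collections import deque
import numpy as np
from sklearn.linear_model import SGDClassifier

class WeightedStreamEnsemble:
    """Online ensemble: probability averaging weighted by recent accuracy."""

    def __init__(self, n_models=3, classes=(0, 1), window=200):
        self.classes = np.array(classes)
        self.models = [SGDClassifier(loss="log_loss", alpha=10.0 ** -(i + 3), random_state=i)
                       for i in range(n_models)]
        self.recent = [deque(maxlen=window) for _ in self.models]   # recent correctness per learner
        self.fitted = False

    def predict_proba(self, x):
        x = x.reshape(1, -1)
        weights = np.array([np.mean(r) if r else 1.0 for r in self.recent]) + 1e-6
        weights /= weights.sum()
        probas = np.vstack([m.predict_proba(x)[0] for m in self.models])
        return weights @ probas                  # performance-weighted probability averaging

    def update(self, x, y):
        x = x.reshape(1, -1)
        for m, r in zip(self.models, self.recent):
            if self.fitted:
                r.append(int(m.predict(x)[0] == y))   # prequential (test-then-train) score
            m.partial_fit(x, [y], classes=self.classes)
        self.fitted = True

# Test-then-train over a synthetic stream with an abrupt drift halfway through.
rng = np.random.default_rng(0)
ens, correct = WeightedStreamEnsemble(), 0
for t in range(2000):
    x = rng.normal(size=5)
    y = int((x[0] if t < 1000 else -x[0]) > 0)   # concept flips at t = 1000
    if t > 0:
        correct += int(ens.predict_proba(x).argmax() == y)
    ens.update(x, y)
print("prequential accuracy:", round(correct / 1999, 3))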


The Need for Advanced Intelligence in NFV Management and Orchestration

arXiv.org Artificial Intelligence

With the constant demand for connectivity at an all-time high, Network Service Providers (NSPs) are required to optimize their networks to cope with rising capital and operational expenditures required to meet the growing connectivity demand. A solution to this challenge was presented through Network Function Virtualization (NFV). As network complexity increases and futuristic networks take shape, NSPs are required to incorporate an increasing amount of operational efficiency into their NFV-enabled networks. One such technique is Machine Learning (ML), which has been applied to various entities in NFV-enabled networks, most notably in the NFV Orchestrator. While traditional ML provides tremendous operational efficiencies, including real-time and high-volume data processing, challenges such as privacy, security, scalability, transferability, and concept drift hinder its widespread implementation. Through the adoption of Advanced Intelligence techniques such as Reinforcement Learning and Federated Learning, NSPs can leverage the benefits of traditional ML while simultaneously addressing the major challenges traditionally associated with it. This work presents the benefits of adopting these advanced techniques, provides a list of potential use cases and research topics, and proposes a bottom-up micro-functionality approach to applying these methods of Advanced Intelligence to NFV Management and Orchestration.


Depth-Optimized Delay-Aware Tree (DO-DAT) for Virtual Network Function Placement

arXiv.org Artificial Intelligence

With the constant increase in demand for data connectivity, network service providers are faced with the task of reducing their capital and operational expenses while ensuring continual improvements to network performance. Although Network Function Virtualization (NFV) has been identified as a solution, several challenges must be addressed to ensure its feasibility. In this paper, we present a machine learning-based solution to the Virtual Network Function (VNF) placement problem. This paper proposes the Depth-Optimized Delay-Aware Tree (DO-DAT) model, which uses the particle swarm optimization technique to optimize decision tree hyper-parameters. Using the Evolved Packet Core (EPC) as a use case, we evaluate the performance of the model and compare it to a previously proposed model and a heuristic placement strategy.
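A compact sketch of the underlying optimization idea, a particle swarm searching decision-tree hyper-parameters against cross-validated accuracy, is given below; the dataset, search bounds, and PSO constants are illustrative and unrelated to the DO-DAT/EPC evaluation.

# Sketch: PSO over decision-tree hyper-parameters (max_depth, min_samples_split).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
LOW, HIGH = np.array([2, 2]), np.array([20, 20])   # bounds for (max_depth, min_samples_split)

def fitness(pos):
    depth, split = int(round(pos[0])), int(round(pos[1]))
    tree = DecisionTreeClassifier(max_depth=depth, min_samples_split=split, random_state=0)
    return cross_val_score(tree, X, y, cv=3).mean()

rng = np.random.default_rng(0)
n_particles, iters, w, c1, c2 = 10, 15, 0.7, 1.5, 1.5
pos = rng.uniform(LOW, HIGH, size=(n_particles, 2))
vel = np.zeros_like(pos)
pbest, pbest_fit = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_fit.argmax()].copy()

for _ in range(iters):
    r1, r2 = rng.random((n_particles, 2)), rng.random((n_particles, 2))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, LOW, HIGH)
    fits = np.array([fitness(p) for p in pos])
    improved = fits > pbest_fit
    pbest[improved], pbest_fit[improved] = pos[improved], fits[improved]
    gbest = pbest[pbest_fit.argmax()].copy()

print("best (max_depth, min_samples_split):", np.round(gbest).astype(int),
      "cv accuracy:", pbest_fit.max().round(3))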