How cloud computing can improve 5G wireless networks

#artificialintelligence

A great deal has been written about the technologies fueling 5G, especially how those technologies will improve users' connectivity experience. Similarly, much has been said about how ongoing technological developments will usher in a new generation of network-aware applications. In this article, we discuss one key aspect of 5G technology and how it will affect the development of wireless network capacity. This is one of the more important but often neglected aspects of the evolution of wireless communication, and it represents yet another reason why the convergence of cloud computing and wireless communications makes so much sense.


AI and ML for Open RAN and 5G

#artificialintelligence

Fast, reliable, and low-latency data services are essential deliverables from telecom operators today. Realizing them is pushing operators to enhance infrastructure, expand network capacity, and mitigate service degradation. Unlike other industries, though, telecom networks are vast monoliths comprising fiber optic cables, proprietary components, and legacy hardware. Because of this, operators spend less effort enhancing and more effort shoring up creaking infrastructure. Radio access networks (RAN) are the backbone of the telecommunications industry. However, the industry has been slow to incubate and evolve newer, cost-effective, and energy-efficient technologies due to monopolization by RAN component manufacturers.


Spectral goodness-of-fit tests for complete and partial network data

arXiv.org Machine Learning

Networks describe the often complex relationships between individual actors. In this work, we address the question of how to determine whether a parametric model, such as a stochastic block model or latent space model, fits a dataset well and will extrapolate to similar data. We use recent results in random matrix theory to derive a general goodness-of-fit test for dyadic data. We show that our method, when applied to a specific model of interest, provides a straightforward, computationally fast way of selecting parameters in a number of commonly used network models. For example, we show how to select the dimension of the latent space in latent space models. Unlike other network goodness-of-fit methods, our general approach does not require simulating from a candidate parametric model, which can be cumbersome with large graphs, and eliminates the need to choose a particular set of statistics on the graph for comparison. It also allows us to perform goodness-of-fit tests on partial network data, such as Aggregated Relational Data. We show with simulations that our method performs well in many situations of interest. We analyze several empirically relevant networks and show that our method leads to improved community detection algorithms. R code to implement our method is available on Github.
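
As a rough illustration of the spectral idea (not the authors' exact test; the function names, the rank-d eigendecomposition estimator, and the threshold near 2 for a standardized noise bulk are all assumptions of this sketch), one can pick the latent dimension as the smallest rank at which the standardized residual matrix looks like pure noise:

    # Illustrative sketch only: select a latent dimension by checking whether the
    # residual A - P_hat still carries signal, via its largest eigenvalue relative
    # to the random-matrix bulk edge.
    import numpy as np

    def rank_d_estimate(A, d):
        """Rank-d spectral approximation of the edge-probability matrix."""
        vals, vecs = np.linalg.eigh(A)
        idx = np.argsort(np.abs(vals))[::-1][:d]
        P = vecs[:, idx] @ np.diag(vals[idx]) @ vecs[:, idx].T
        return np.clip(P, 1e-6, 1 - 1e-6)

    def residual_edge_statistic(A, P):
        """Largest eigenvalue of the entrywise-standardized residual, scaled so a
        pure-noise residual gives a value close to 2."""
        n = A.shape[0]
        R = (A - P) / np.sqrt(P * (1 - P))
        np.fill_diagonal(R, 0.0)
        return np.max(np.abs(np.linalg.eigvalsh(R))) / np.sqrt(n)

    def select_dimension(A, d_max=10, threshold=2.1):
        """Smallest d whose residual spectrum looks like noise."""
        for d in range(1, d_max + 1):
            if residual_edge_statistic(A, rank_d_estimate(A, d)) < threshold:
                return d
        return d_max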


Active Learning for Network Traffic Classification: A Technical Survey

arXiv.org Artificial Intelligence

Network Traffic Classification (NTC) has become an important component in a wide variety of network management operations, e.g., Quality of Service (QoS) provisioning and security. Machine Learning (ML) algorithms, as a common approach for NTC methods, can achieve reasonable accuracy and handle encrypted traffic. However, ML-based NTC techniques suffer from a shortage of labeled traffic data, which is the case in many real-world applications. This study investigates the applicability of an active form of ML, called Active Learning (AL), which reduces the need for a large number of labeled examples by actively choosing the instances that should be labeled. The study first provides an overview of NTC and its fundamental challenges, along with a survey of the literature on using ML techniques in NTC. Then, it introduces the concepts of AL, discusses it in the context of NTC, and reviews the literature in this field. Further, challenges and open issues in the use of AL for NTC are discussed. Additionally, as a technical survey, some experiments are conducted to show the broad applicability of AL in NTC. The simulation results show that AL can achieve high accuracy with a small amount of data.
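
A minimal sketch of the core AL idea (pool-based least-confident uncertainty sampling); the RandomForest base learner, feature matrix, and loop parameters are assumptions for illustration, not the survey's prescribed setup:

    # Pool-based active learning: train on a small labeled set, then repeatedly
    # query labels for the flows the classifier is least confident about.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def active_learning_loop(X_pool, y_pool, n_init=50, n_query=20, n_rounds=10):
        rng = np.random.default_rng(0)
        labeled = list(rng.choice(len(X_pool), size=n_init, replace=False))
        unlabeled = [i for i in range(len(X_pool)) if i not in set(labeled)]
        clf = RandomForestClassifier(n_estimators=100, random_state=0)
        for _ in range(n_rounds):
            clf.fit(X_pool[labeled], y_pool[labeled])         # train on labeled set
            proba = clf.predict_proba(X_pool[unlabeled])      # score the pool
            uncertainty = 1.0 - proba.max(axis=1)             # least-confident sampling
            picked = np.argsort(uncertainty)[::-1][:n_query]  # most uncertain flows
            newly = [unlabeled[i] for i in picked]            # the "oracle" labels these
            labeled += newly
            unlabeled = [i for i in unlabeled if i not in set(newly)]
        return clf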


3D UAV Trajectory and Data Collection Optimisation via Deep Reinforcement Learning

arXiv.org Artificial Intelligence

Unmanned aerial vehicles (UAVs) are now beginning to be deployed to enhance network performance and coverage in wireless communication. However, due to the limitations of their on-board power and flight time, it is challenging to obtain an optimal resource allocation scheme for the UAV-assisted Internet of Things (IoT). In this paper, we design a new UAV-assisted IoT system that relies on the shortest flight path of the UAVs while maximising the amount of data collected from IoT devices. Then, a deep reinforcement learning-based technique is conceived for finding the optimal trajectory and throughput in a specific coverage area. After training, the UAV is able to autonomously collect all the data from user nodes with a significant improvement in total sum-rate while minimising the associated resources used. Numerical results are provided to highlight how our techniques strike a balance between the throughput attained, the trajectory, and the time spent. More explicitly, we characterise the attainable performance in terms of the UAV trajectory, the expected reward, and the total sum-rate.
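
To make the reward-shaping intuition concrete, here is a toy tabular Q-learning stand-in for the paper's deep RL formulation (the grid world, reward values, and hyperparameters are assumptions of this sketch, not the paper's model): the UAV earns a reward for visiting IoT nodes and pays a per-step cost, which pushes it toward short, data-collecting trajectories.

    # Toy grid-world stand-in for 3D UAV trajectory optimisation via RL.
    import numpy as np

    GRID, NODES = 5, {(1, 3), (4, 0), (3, 4)}      # cells holding IoT devices
    ACTIONS = [(-1, 0), (1, 0), (0, -1), (0, 1)]   # up / down / left / right

    Q = np.zeros((GRID, GRID, len(ACTIONS)))
    alpha, gamma, eps = 0.1, 0.95, 0.1

    for episode in range(5000):
        pos, remaining = (0, 0), set(NODES)
        for _ in range(50):
            a = np.random.randint(4) if np.random.rand() < eps else int(np.argmax(Q[pos]))
            nxt = (min(max(pos[0] + ACTIONS[a][0], 0), GRID - 1),
                   min(max(pos[1] + ACTIONS[a][1], 0), GRID - 1))
            reward = -1.0                  # time/energy cost per move
            if nxt in remaining:
                reward += 20.0             # data collected from an IoT node
                remaining.discard(nxt)
            Q[pos][a] += alpha * (reward + gamma * np.max(Q[nxt]) - Q[pos][a])
            pos = nxt
            if not remaining:
                break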


Network and Physical Layer Attacks and Countermeasures to AI-Enabled 6G O-RAN

arXiv.org Artificial Intelligence

Artificial intelligence (AI) will play an increasing role in cellular network deployment, configuration, and management. This paper examines the security implications of AI-driven 6G radio access networks (RANs). While the expected timeline for 6G standardization is still several years out, pre-standardization efforts related to 6G security are already ongoing and will benefit from fundamental and experimental research. Open RAN (O-RAN) describes an industry-driven open architecture and interfaces for building next-generation RANs with AI control. Considering this architecture, we identify the critical threats to data-driven network and physical layer elements, the corresponding countermeasures, and the relevant research directions. The steady increase in the number of connected devices and in heterogeneous communication performance demands has driven wireless business and research and development (R&D) efforts.


Machine Learning as a Service (MLaaS) Market: Some Ridiculously Simple Ways To Improve - The Courier

#artificialintelligence

IT equipment consists of products such as personal computers (PCs), servers, monitors, storage devices, etc. Software comprises computer programs, firmware, and applications. The IT & business services segment is further classified into consulting, custom solutions development, outsourcing services, etc. The telecommunication equipment segment consists of telecom equipment such as switches, routers, etc. The carrier services segment comprises operations-related revenue spent by telecom service providers on acquiring telecom capacity, primarily from overseas carriers. How Important Is Machine Learning as a Service (MLaaS)?


Machine Learning for Performance Prediction of Channel Bonding in Next-Generation IEEE 802.11 WLANs

arXiv.org Artificial Intelligence

With the advent of Artificial Intelligence (AI)-empowered communications, industry, academia, and standardization organizations are progressing on the definition of mechanisms and procedures to address the increasing complexity of future 5G and beyond communications. In this context, the International Telecommunication Union (ITU) organized the first AI for 5G Challenge to bring industry and academia together to introduce and solve representative problems related to the application of Machine Learning (ML) to networks. In this paper, we present the results gathered from Problem Statement 13 (PS-013), organized by Universitat Pompeu Fabra (UPF), whose primary goal was to predict the performance of next-generation Wireless Local Area Networks (WLANs) applying Channel Bonding (CB) techniques. In particular, we overview the ML models proposed by participants (including Artificial Neural Networks, Graph Neural Networks, Random Forest regression, and gradient boosting) and analyze their performance on an open dataset generated using the IEEE 802.11ax-oriented Komondor network simulator. The accuracy achieved by the proposed methods demonstrates the suitability of ML for predicting the performance of WLANs. Moreover, we discuss the importance of abstracting WLAN interactions to achieve better results, and we argue that there is certainly room for improvement in throughput prediction through ML.
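
In the spirit of the Random Forest regression entries, a minimal throughput-prediction sketch might look like the following; the CSV file name, feature columns, and target column are assumptions for illustration, not the actual PS-013 dataset schema:

    # Hedged sketch: Random Forest regressor for WLAN throughput prediction.
    import pandas as pd
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import mean_absolute_error

    df = pd.read_csv("komondor_scenarios.csv")            # hypothetical export
    features = ["primary_channel", "bonded_width_mhz",    # assumed column names
                "num_stas", "rssi_dbm", "airtime_share"]
    X_train, X_test, y_train, y_test = train_test_split(
        df[features], df["throughput_mbps"], test_size=0.2, random_state=0)

    model = RandomForestRegressor(n_estimators=300, random_state=0)
    model.fit(X_train, y_train)
    print("MAE (Mbps):", mean_absolute_error(y_test, model.predict(X_test)))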


Network Activities Recognition and Analysis Based on Supervised Machine Learning Classification Methods Using J48 and Naïve Bayes Algorithm

arXiv.org Artificial Intelligence

Network activities recognition has always been a significant component of intrusion detection. However, with increasing network traffic flow and the growing complexity of network behavior, it is becoming more and more difficult for network monitoring software to identify specific user behavior quickly and accurately. It also requires system security staff to pay close attention to the latest intrusion monitoring technologies and methods. All of this greatly increases the difficulty and complexity of intrusion detection tasks. The application of machine learning methods based on supervised classification would help free network security staff from these heavy and tedious tasks. A fine-tuned model can accurately recognize user behavior, providing persistent monitoring with relatively high accuracy and good adaptability. Finally, the results of network activities recognition by the J48 and Naïve Bayes algorithms are introduced and evaluated.
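
A small sketch of the supervised-classification setup using scikit-learn stand-ins: GaussianNB for Naïve Bayes and an entropy-based DecisionTreeClassifier in place of Weka's J48 (C4.5). The flow-feature columns and file name are assumptions, not the paper's dataset.

    # Compare a Naive Bayes and a J48-like decision tree on labeled flow records.
    import pandas as pd
    from sklearn.model_selection import train_test_split
    from sklearn.naive_bayes import GaussianNB
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.metrics import accuracy_score

    flows = pd.read_csv("labeled_flows.csv")               # hypothetical flow records
    X = flows[["duration", "pkts_fwd", "pkts_bwd", "bytes_fwd", "bytes_bwd"]]
    y = flows["activity"]                                  # e.g. web, streaming, p2p
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    for name, clf in [("Naive Bayes", GaussianNB()),
                      ("Decision tree (J48-like)", DecisionTreeClassifier(criterion="entropy"))]:
        clf.fit(X_tr, y_tr)
        print(name, "accuracy:", accuracy_score(y_te, clf.predict(X_te)))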


FENXI: Deep-learning Traffic Analytics at the Edge

arXiv.org Artificial Intelligence

Live traffic analysis at the first aggregation point in the ISP network enables the implementation of complex traffic engineering policies but is limited by scarce processing capabilities, especially for Deep Learning (DL) based analytics. The introduction of specialized hardware accelerators, i.e., the Tensor Processing Unit (TPU), offers the opportunity to enhance the processing capabilities of network devices at the edge. Yet, to date, no packet processing pipeline is capable of offering DL-based analysis capabilities in the data plane without interfering with network operations. In this paper, we present FENXI, a system to run complex analytics by leveraging the TPU. The design of FENXI decouples forwarding operations and traffic analytics, which operate at different granularities, i.e., packet and flow levels. We conceive two independent modules that asynchronously communicate to exchange network data and analytics results, and design data structures to extract flow-level statistics without impacting per-packet processing. We prototyped and evaluated FENXI on general-purpose servers considering both adversarial and realistic network conditions. Our analysis shows that FENXI can sustain 100 Gbps line-rate traffic processing requiring only limited resources, while also dynamically adapting to variable network conditions.
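
A toy sketch of the decoupling idea only (not FENXI's actual data structures or model): a lightweight per-packet path updates flow counters, while a separate analytics worker periodically snapshots the flow table and scores flows asynchronously. The dummy scoring function stands in for the DL model.

    # Decoupled packet path and asynchronous flow-level analytics.
    import threading, time, queue
    from collections import defaultdict

    flow_table = defaultdict(lambda: {"pkts": 0, "bytes": 0})
    table_lock = threading.Lock()
    results = queue.Queue()

    def on_packet(five_tuple, size):
        """Fast path: constant-time counter update, no inference here."""
        with table_lock:
            entry = flow_table[five_tuple]
            entry["pkts"] += 1
            entry["bytes"] += size

    def analytics_worker(period=1.0):
        """Slow path: snapshot flow stats and score them asynchronously."""
        while True:
            time.sleep(period)
            with table_lock:
                snapshot = {k: dict(v) for k, v in flow_table.items()}
            for flow, stats in snapshot.items():
                # Mean packet size as a stand-in for a real DL inference step.
                results.put((flow, stats["bytes"] / max(stats["pkts"], 1)))

    threading.Thread(target=analytics_worker, daemon=True).start()
    on_packet(("10.0.0.1", "10.0.0.2", 443, 51512, "TCP"), 1400)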