Telecommunications
Graph Theory and Optimization Problems for Very Large Networks
Graph theory provides a primary tool for analyzing and designing computer communication networks. In the past few decades, graph theory has been used to study various types of networks, including the Internet, wide area networks, and local area networks, as well as networking protocols such as the Border Gateway Protocol and Open Shortest Path First. In this paper, we present some key graph theory concepts used to represent different types of networks. Then we describe how networks are modeled to investigate problems related to network protocols. Finally, we present some of the tools used to generate graphs representing practical networks.
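As a brief illustration of the graph representation underlying such studies, the following sketch (ours, not the paper's) models a small network as a weighted adjacency list and computes least-cost routes with Dijkstra's algorithm, the shortest-path computation on which link-state protocols such as OSPF rely; the node names and link costs are invented for the example.

```python
# Minimal sketch: a network as a weighted adjacency list, with least-cost
# routes computed by Dijkstra's algorithm (the core of link-state routing).
import heapq

def dijkstra(graph, source):
    """Return the least-cost distance from `source` to every reachable node."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Adjacency list: node -> list of (neighbor, link cost); values are illustrative.
network = {
    "A": [("B", 1), ("C", 4)],
    "B": [("A", 1), ("C", 2), ("D", 5)],
    "C": [("A", 4), ("B", 2), ("D", 1)],
    "D": [("B", 5), ("C", 1)],
}
print(dijkstra(network, "A"))  # {'A': 0, 'B': 1, 'C': 3, 'D': 4}
```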
Impact of Cognitive Radio on Future Management of Spectrum
Cognitive radio is a breakthrough technology which is expected to have a profound impact on the way radio spectrum will be accessed, managed and shared in the future. In this paper I examine some of the implications of cognitive radio for future management of spectrum. Both a near-term view involving the opportunistic spectrum access model and a longer-term view involving a self-regulating dynamic spectrum access model within a society of cognitive radios are discussed.
On the Geometry of Discrete Exponential Families with Application to Exponential Random Graph Models
Fienberg, Stephen E., Rinaldo, Alessandro, Zhou, Yi
There has been an explosion of interest in statistical models for analyzing network data, and considerable interest in the class of exponential random graph (ERG) models, especially in connection with difficulties in computing maximum likelihood estimates. The issues associated with these difficulties relate to the broader structure of discrete exponential families. This paper re-examines these issues in two parts. First, we consider the closure of $k$-dimensional exponential families of distributions with discrete base measure and polyhedral convex support $\mathrm{P}$. We show that the normal fan of $\mathrm{P}$ is a geometric object that plays a fundamental role in deriving the statistical and geometric properties of the corresponding extended exponential families. We discuss its relevance to maximum likelihood estimation from both a theoretical and a computational standpoint. Second, we apply our results to the analysis of ERG models. By means of a detailed example, we characterize several properties of ERG models, in particular the behavior known as degeneracy.
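To make the setting concrete, the following display sketches the standard form of a $k$-dimensional discrete exponential family and the moment-matching equation for the MLE; the notation ($t$, $\theta$, $\psi$, $\nu$) is our own shorthand for the setup described above, not a quotation of the paper's statements.

```latex
% Standard discrete exponential family with natural parameter \theta,
% sufficient statistic t(x), base measure \nu, and convex support
% P = conv{ t(x) : \nu(x) > 0 }.
\[
  p_\theta(x) \;=\; \nu(x)\,\exp\{\langle \theta, t(x)\rangle - \psi(\theta)\},
  \qquad
  \psi(\theta) \;=\; \log \sum_{x} \nu(x)\,\exp\{\langle \theta, t(x)\rangle\}.
\]
% The MLE solves the moment-matching equation
\[
  \nabla \psi(\hat\theta) \;=\; \mathbb{E}_{\hat\theta}\,[\,t(X)\,] \;=\; t(x_{\mathrm{obs}}),
\]
% which has a solution in the ordinary family only when t(x_obs) lies in the
% relative interior of P; observations on a face of P lead to the extended
% family, whose faces are indexed by the cones of the normal fan of P.
```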
Hierarchical structure and the prediction of missing links in networks
Clauset, Aaron, Moore, Cristopher, Newman, M. E. J.
Networks have in recent years emerged as an invaluable tool for describing and quantifying complex systems in many branches of science. Recent studies suggest that networks often exhibit hierarchical organization, where vertices divide into groups that further subdivide into groups of groups, and so forth over multiple scales. In many cases these groups are found to correspond to known functional units, such as ecological niches in food webs, modules in biochemical networks (protein interaction networks, metabolic networks, or genetic regulatory networks), or communities in social networks. Here we present a general technique for inferring hierarchical structure from network data and demonstrate that the existence of hierarchy can simultaneously explain and quantitatively reproduce many commonly observed topological properties of networks, such as right-skewed degree distributions, high clustering coefficients, and short path lengths. We further show that knowledge of hierarchical structure can be used to predict missing connections in partially known networks with high accuracy, and for more general network structures than competing techniques. Taken together, our results suggest that hierarchy is a central organizing principle of complex networks, capable of offering insight into many network phenomena.
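As a rough illustration of how a hierarchical model can be scored against observed network data, the sketch below computes the likelihood of a fixed dendrogram under a hierarchical random graph in which each internal node carries a connection probability estimated from the edges it separates; the dendrogram, edge list, and helper names are invented for the example, and the paper's full inference procedure (sampling over dendrograms) is not shown.

```python
# Likelihood of one fixed dendrogram: for each internal node r, the MLE of its
# connection probability is p_r = E_r / (L_r * R_r), where E_r counts observed
# edges between the left and right subtrees and L_r, R_r count their leaves.
from math import log

def loglik(dendro, edges):
    """dendro: list of (left_leaves, right_leaves) sets, one per internal node.
    edges: set of frozensets {u, v}. Returns the log-likelihood."""
    total = 0.0
    for left, right in dendro:
        e = sum(1 for u in left for v in right if frozenset((u, v)) in edges)
        n_pairs = len(left) * len(right)
        p = e / n_pairs
        if 0 < p < 1:
            total += e * log(p) + (n_pairs - e) * log(1 - p)
        # p == 0 or p == 1 contributes log(1) = 0
    return total

edges = {frozenset(p) for p in [("a", "b"), ("c", "d"), ("b", "c")]}
dendro = [({"a"}, {"b"}), ({"c"}, {"d"}), ({"a", "b"}, {"c", "d"})]
print(loglik(dendro, edges))  # about -2.25 for this toy example
```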
Collective Classification in Network Data
Sen, Prithviraj (University of Maryland) | Namata, Galileo (University of Maryland) | Bilgic, Mustafa (University of Maryland) | Getoor, Lise (University of Maryland) | Gallagher, Brian (University of Maryland) | Eliassi-Rad, Tina (University of Maryland)
Many real-world applications produce networked data such as the world-wide web (hypertext documents connected via hyperlinks), social networks (for example, people connected by friendship links), communication networks (computers connected via communication links) and biological networks (for example, protein interaction networks). A recent focus in machine learning research has been to extend traditional machine learning classification techniques to classify nodes in such networks. In this article, we provide a brief introduction to this area of research and how it has progressed during the past decade. We introduce four of the most widely used inference algorithms for classifying networked data and empirically compare them on both synthetic and real-world data.
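As a flavor of how such inference algorithms operate, here is a deliberately simplified sketch of iterative classification on a partially labeled graph, with a neighbor majority vote standing in for the trained local classifier that real collective classifiers use; the graph, labels, and function names are illustrative only.

```python
# Simplified iterative classification: unlabeled nodes repeatedly take the
# majority label of their already-labeled neighbors until the assignment
# stabilizes. Full collective classifiers replace the vote with a learned
# local classifier over node and neighbor features.
from collections import Counter

def iterative_classify(adj, seed_labels, max_iters=10):
    labels = dict(seed_labels)
    for _ in range(max_iters):
        changed = False
        for node, neighbors in adj.items():
            if node in seed_labels:
                continue  # keep known labels fixed
            votes = Counter(labels[n] for n in neighbors if n in labels)
            if votes:
                best = votes.most_common(1)[0][0]
                if labels.get(node) != best:
                    labels[node] = best
                    changed = True
        if not changed:
            break
    return labels

adj = {1: [2, 3], 2: [1, 3], 3: [1, 2, 4], 4: [3, 5], 5: [4]}
print(iterative_classify(adj, {1: "A", 5: "B"}))
```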
Randomized Distributed Configuration Management of Wireless Networks: Multi-layer Markov Random Fields and Near-Optimality
Distributed configuration management is imperative for wireless infrastructureless networks, where each node adjusts its physical and logical configuration locally through information exchange with neighbors. Two issues remain open: optimality and complexity. We study these issues through modeling, analysis, and randomized distributed algorithms. Modeling defines the optimality. We first derive a global probabilistic model for a network configuration that jointly characterizes the statistical spatial dependence of a physical and a logical configuration. We then show that a local model approximating the global model is a two-layer Markov random field, or random bond model. The complexity of the local model is characterized by the communication range among nodes. The local model is near-optimal when its approximation error with respect to the global model is within a given error bound. We analyze the trade-off between approximation error and complexity, and derive sufficient conditions for the near-optimality of the local model. We also validate the model, the analysis, and the randomized distributed algorithms through simulation.
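The following sketch (our assumptions, not the paper's model) illustrates the flavor of a randomized distributed update on a pairwise Markov random field: each node resamples its local configuration from the conditional distribution given only the neighbors within its communication range.

```python
# Gibbs-style randomized local updates on a pairwise MRF over a chain of nodes;
# each node uses only neighbor states, so the rule is fully distributed.
# Topology, coupling strength, and binary configurations are illustrative.
import math, random

def gibbs_step(config, neighbors, coupling=1.0):
    """One asynchronous sweep; config maps node -> +1/-1."""
    for node in config:
        field = coupling * sum(config[v] for v in neighbors[node])
        p_plus = 1.0 / (1.0 + math.exp(-2.0 * field))  # P(x_node = +1 | neighbors)
        config[node] = 1 if random.random() < p_plus else -1
    return config

random.seed(0)
neighbors = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}   # chain topology
config = {n: random.choice([-1, 1]) for n in neighbors}
for _ in range(20):
    gibbs_step(config, neighbors)
print(config)
```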
The Role of Artificial Intelligence Technologies in Crisis Response
Khalil, Khaled M., Abdel-Aziz, M., Nazmy, Taymour T., Salem, Abdel-Badeeh M.
Crisis response poses many of the most difficult information technology challenges in crisis management. It requires information- and communication-intensive efforts to reduce uncertainty, calculate and compare costs and benefits, and manage resources in a fashion beyond what is regularly available for handling routine problems. In this paper, we explore the benefits of artificial intelligence technologies in crisis response and discuss the role of several such technologies, namely robotics, ontologies and the semantic web, and multi-agent systems.
Unicast and Multicast QoS Routing with Soft Constraint Logic Programming
Bistarelli, Stefano, Montanari, Ugo, Rossi, Francesca, Santini, Francesco
We present a formal model to represent and solve the unicast/multicast routing problem in networks with Quality of Service (QoS) requirements. To attain this, we first translate the network into a weighted graph (unicast) or an and-or graph (multicast), where the weight on a connector corresponds to the multidimensional cost of sending a packet on the related network link: each component of the weight vector represents a different QoS metric value (e.g. bandwidth, cost, delay, packet loss). The second step consists in writing this graph as a program in Soft Constraint Logic Programming (SCLP): the engine of this framework is then able to find the best paths/trees by optimizing their costs and solving the constraints imposed on them (e.g. delay < 40 msec), thus solving the QoS routing problem. Moreover, c-semiring structures are a convenient tool for modeling QoS metrics. Finally, we provide an implementation of the framework over scale-free networks and suggest how its performance can be improved.
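As a small illustration of composing multidimensional QoS costs along a path, the sketch below combines per-link (bandwidth, delay, loss) weights with a semiring-like operator; the link figures, chosen metrics, and function names are invented for the example and do not reproduce the paper's SCLP encoding.

```python
# Composing multidimensional QoS link weights along a path, in the spirit of a
# c-semiring: bandwidth combines with min, delay with sum, and loss-free
# probability multiplicatively.

def combine(w1, w2):
    """Compose two (bandwidth, delay_ms, success_prob) link weights."""
    return (min(w1[0], w2[0]), w1[1] + w2[1], w1[2] * w2[2])

def path_cost(links):
    cost = links[0]
    for w in links[1:]:
        cost = combine(cost, w)
    return cost

# Two candidate paths as lists of per-link (bandwidth Mbps, delay ms, success prob)
path_a = [(100, 10, 0.999), (50, 15, 0.995)]
path_b = [(80, 5, 0.990), (80, 5, 0.990), (80, 5, 0.990)]
print(path_cost(path_a))  # (50, 25, 0.994005)
print(path_cost(path_b))  # (80, 15, 0.970299)
# A constraint such as "delay < 40 msec" filters candidate paths before the
# best one is chosen under the partial order induced by the semiring.
```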
Intelligent Content Discovery on the Mobile Internet: Experiences and Lessons Learned
Smyth, Barry (University College Dublin) | Cotter, Paul (ChangingWorlds) | Oman, Stephen (ChangingWorlds)
The mobile Internet represents a massive opportunity for mobile operators and content providers. Today there are more than 2 billion mobile subscribers, with 3 billion predicted by the end of 2007. However, despite significant improvements in handsets, infrastructure, content, and charging models, mobile users are still struggling to access and locate relevant content and services. An important part of this so-called content-discovery problem relates to the navigation effort that users must invest in browsing and searching for mobile content. In this article we describe one successfully deployed solution, which uses personalization technology to profile subscriber interests in order to automatically adapt mobile portals to their learned preferences. We present summary results, from our deployment experiences with more than 40 mobile operators and millions of subscribers around the world, which demonstrate how this solution can have a significant impact on portal usability, subscriber usage, and mobile operator revenues.
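Purely as an illustration of the underlying idea (not the deployed system), the sketch below profiles a subscriber's clicks and reorders a portal menu so that frequently accessed items require fewer navigation steps; the menu items and helper names are invented.

```python
# Reorder a portal menu by per-subscriber click frequency, keeping the
# operator's default order as a tie-breaker.
from collections import Counter

def personalize_menu(default_menu, click_history):
    counts = Counter(click_history)
    # Stable sort: frequently clicked items first, otherwise keep default order.
    return sorted(default_menu, key=lambda item: -counts[item])

default_menu = ["News", "Sport", "Weather", "Ringtones", "Email"]
clicks = ["Sport", "Email", "Sport", "News", "Sport", "Email"]
print(personalize_menu(default_menu, clicks))
# ['Sport', 'Email', 'News', 'Weather', 'Ringtones']
```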
Differential Entropic Clustering of Multivariate Gaussians
Davis, Jason V., Dhillon, Inderjit S.
Gaussian data is pervasive and many learning algorithms (e.g., k-means) model their inputs as a single sample drawn from a multivariate Gaussian. However, in many real-life settings, each input object is best described by multiple samples drawn from a multivariate Gaussian. Such data can arise, for example, in a movie review database where each movie is rated by several users, or in time-series domains such as sensor networks. Here, each input can be naturally described by both a mean vector and covariance matrix which parameterize the Gaussian distribution. In this paper, we consider the problem of clustering such input objects, each represented as a multivariate Gaussian. We formulate the problem using an information theoretic approach and draw several interesting theoretical connections to Bregman divergences and also Bregman matrix divergences. We evaluate our method across several domains, including synthetic data, sensor network data, and a statistical debugging application.
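As a rough sketch of the clustering task (our simplifications, not the paper's algorithm), the code below assigns Gaussians, each given by a mean vector and covariance matrix, to cluster representatives using the differential relative entropy (KL divergence) between Gaussians, with a naive averaging step for the representatives.

```python
# k-means-style clustering of (mean, covariance) objects under the KL
# divergence between multivariate Gaussians. The representative update here
# simply averages means and covariances, a simplification for illustration.
import numpy as np

def kl_gaussian(mu0, S0, mu1, S1):
    """KL( N(mu0,S0) || N(mu1,S1) ) for d-dimensional Gaussians."""
    d = len(mu0)
    S1_inv = np.linalg.inv(S1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(S1_inv @ S0) + diff @ S1_inv @ diff - d
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

def cluster(gaussians, k, iters=20, seed=0):
    rng = np.random.default_rng(seed)
    reps = [gaussians[i] for i in rng.choice(len(gaussians), k, replace=False)]
    assign = [0] * len(gaussians)
    for _ in range(iters):
        assign = [min(range(k), key=lambda j: kl_gaussian(mu, S, *reps[j]))
                  for mu, S in gaussians]
        for j in range(k):
            members = [g for g, a in zip(gaussians, assign) if a == j]
            if members:
                reps[j] = (np.mean([m for m, _ in members], axis=0),
                           np.mean([S for _, S in members], axis=0))
    return assign

gaussians = [(np.array([0.0, 0.0]), np.eye(2)),
             (np.array([0.1, 0.0]), np.eye(2) * 1.2),
             (np.array([5.0, 5.0]), np.eye(2) * 0.5)]
print(cluster(gaussians, k=2))  # the two nearby Gaussians share a cluster
```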