REM: From Structural Entropy to Community Structure Deception

Neural Information Processing Systems

This paper focuses on the privacy risks of disclosing the community structure in an online social network. This raises the problem of community structure deception (CSD), which asks for ways to minimally modify the network so that a given community structure maximally hides itself from community detection algorithms. We investigate CSD through an information-theoretic lens. To this end, we propose a community-based structural entropy to express the amount of information revealed by a community structure. This notion allows us to devise residual entropy minimization (REM) as an efficient procedure to solve CSD.
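The abstract does not spell out the entropy it builds on, but a community-conditioned structural entropy in the two-level (Li–Pan) tradition can be sketched in a few lines. The sketch below is illustrative only: the function name, the use of networkx, and the karate-club example are assumptions, and the paper's community-based structural entropy and the REM procedure itself may differ in normalization and details.

```python
import math
import networkx as nx

def two_level_structural_entropy(G, communities):
    """Two-level structural entropy of graph G under a node partition.

    `communities` is a list of node sets covering all nodes of G. This follows
    the usual two-level structural information formula (a node term plus a
    community term); the paper's community-based entropy may differ.
    """
    two_m = 2 * G.number_of_edges()
    H = 0.0
    for comm in communities:
        vol = sum(G.degree(v) for v in comm)           # total degree inside the community
        cut = sum(1 for u, v in G.edges()              # edges crossing the community boundary
                  if (u in comm) != (v in comm))
        if vol == 0:
            continue
        # contribution of locating each node within its community
        for v in comm:
            d = G.degree(v)
            if d > 0:
                H -= (d / two_m) * math.log2(d / vol)
        # contribution of locating the community itself
        H -= (cut / two_m) * math.log2(vol / two_m)
    return H

# Toy usage: entropy of the karate-club graph under a crude two-block split
G = nx.karate_club_graph()
split = [{n for n in G if n < 17}, {n for n in G if n >= 17}]
print(round(two_level_structural_entropy(G, split), 3))
```

The two terms mirror the two levels of the encoding: first locating a node's community, then locating the node inside it.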


Estimation of Rényi Entropy and Mutual Information Based on Generalized Nearest-Neighbor Graphs

Neural Information Processing Systems

We present simple and computationally efficient nonparametric estimators of Rényi entropy and mutual information based on an i.i.d. sample drawn from an unknown, absolutely continuous distribution over $\mathbb{R}^d$. The estimators are calculated as the sum of $p$-th powers of the Euclidean lengths of the edges of the "generalized nearest-neighbor" graph of the sample and of the empirical copula of the sample, respectively. For the first time, we prove the almost sure consistency of these estimators and upper bounds on their rates of convergence, the latter under the assumption that the density underlying the sample is Lipschitz continuous. Experiments demonstrate their usefulness in independent subspace analysis.
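At a high level, the entropy estimator is a normalized sum of powered edge lengths. The sketch below implements one simple member of this family: a k-nearest-neighbor graph with $p = d(1-\alpha)$, and a normalizing constant $\gamma$ calibrated by Monte Carlo on uniform samples over $[0,1]^d$ (for which the Rényi entropy is zero). The graph family, the calibration trick, and the function names are assumptions, not the paper's exact construction.

```python
import numpy as np
from scipy.spatial import cKDTree

def nn_edge_power_sum(X, k=1, p=1.0):
    """Sum of p-th powers of edge lengths in the k-nearest-neighbor graph of X."""
    tree = cKDTree(X)
    # query returns each point itself as neighbor 0, so ask for k+1 and drop column 0
    dists, _ = tree.query(X, k=k + 1)
    return float(np.sum(dists[:, 1:] ** p))

def renyi_entropy_estimate(X, alpha=0.5, k=1, n_calib=20):
    """Nearest-neighbor-graph estimate of the Renyi entropy H_alpha of the sample X.

    Uses L_p(X) / (gamma * n**alpha) with p = d * (1 - alpha); gamma is
    calibrated on uniform samples, whose Renyi entropy is zero. Heuristic
    sketch only, not the paper's exact estimator.
    """
    n, d = X.shape
    p = d * (1.0 - alpha)
    L = nn_edge_power_sum(X, k=k, p=p)
    # calibrate gamma so the estimator returns ~0 on Uniform([0, 1]^d)
    rng = np.random.default_rng(0)
    calib = [nn_edge_power_sum(rng.random((n, d)), k=k, p=p) / n ** alpha
             for _ in range(n_calib)]
    gamma = float(np.mean(calib))
    return np.log(L / (gamma * n ** alpha)) / (1.0 - alpha)

# Toy usage on a 2-D Gaussian sample
X = np.random.default_rng(1).normal(size=(2000, 2))
print(renyi_entropy_estimate(X, alpha=0.5))
```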


Graph Entropy Guided Node Embedding Dimension Selection for Graph Neural Networks

arXiv.org Artificial Intelligence

Graph representation learning has achieved great success in many areas, including e-commerce, chemistry, and biology. However, the fundamental problem of choosing the appropriate dimension of node embedding for a given graph remains unsolved. The commonly used strategies for Node Embedding Dimension Selection (NEDS), based on grid search or empirical knowledge, suffer from heavy computation and poor model performance. In this paper, we revisit NEDS from the perspective of the minimum entropy principle and propose a novel Minimum Graph Entropy (MinGE) algorithm for NEDS on graph data. Specifically, MinGE considers both feature entropy and structure entropy on graphs, each carefully designed around the characteristics of the rich information it captures. The feature entropy, which assumes the embeddings of adjacent nodes to be more similar, connects node features and link topology on graphs. The structure entropy takes the normalized degree as the basic unit to further measure the higher-order structure of graphs. Based on these, we design MinGE to directly calculate the ideal node embedding dimension for any graph. Finally, comprehensive experiments with popular Graph Neural Networks (GNNs) on benchmark datasets demonstrate the effectiveness and generalizability of the proposed MinGE.
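As a toy reading of "normalized degree as the basic unit", the snippet below computes the Shannon entropy of a graph's normalized-degree distribution. MinGE's actual structure entropy also captures higher-order structure and is combined with a feature entropy term before the dimension is selected, so treat this purely as an illustration of the ingredient, not the algorithm.

```python
import math
import networkx as nx

def degree_structure_entropy(G):
    """Shannon entropy of the normalized-degree distribution of G.

    A toy reading of "normalized degree as the basic unit"; MinGE's structure
    entropy is defined differently and is only one part of its objective.
    """
    total = sum(d for _, d in G.degree())
    probs = [d / total for _, d in G.degree() if d > 0]
    return -sum(p * math.log(p) for p in probs)

G = nx.erdos_renyi_graph(100, 0.05, seed=0)
print(round(degree_structure_entropy(G), 3))
```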


The Computational Theory of Intelligence: Information Entropy

arXiv.org Artificial Intelligence

This paper presents an information-theoretic approach to the concept of intelligence in the computational sense. We introduce a probabilistic framework within which computational intelligence is shown to be an entropy-minimizing process at the local level. Using this new scheme, we develop a simple data-driven clustering example and discuss its applications.


Making entropy production work

#artificialintelligence

While Rolf Landauer was working at IBM in the early 1960s, he had a startling insight about how heat, entropy, and information were connected. Landauer realized that manipulating information releases heat and increases entropy, or the disorder of the environment. He used this to calculate a theoretical lower limit for heat released from a computation, such as erasing a bit. At room temperature, the limit is about 10⁻²¹ joules, or one billionth of a trillionth of a joule. But Landauer's limit is, ironically, limited because of how general it is.
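The quoted figure follows directly from Landauer's bound of k_B·T·ln 2 per erased bit; a quick check, taking room temperature as 300 K:

```python
import math

k_B = 1.380649e-23          # Boltzmann constant, J/K (exact SI value)
T = 300.0                   # room temperature, K

# Landauer's bound: minimum heat dissipated to erase one bit of information
landauer_limit = k_B * T * math.log(2)
print(f"{landauer_limit:.2e} J")   # about 2.9e-21 J, i.e. roughly 10^-21 J
```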