
Graph Attention Network for Node Regression on Random Geometric Graphs with Erdős–Rényi contamination

Laha, Somak, Liu, Suqi, Austern, Morgane

arXiv.org Machine Learning

Graph attention networks (GATs) are widely used and often appear robust to noise in node covariates and edges, yet rigorous statistical guarantees demonstrating a provable advantage of GATs over non-attention graph neural networks (GNNs) are scarce. We partially address this gap for node regression with graph-based errors-in-variables models under simultaneous covariate and edge corruption: responses are generated from latent node-level covariates, but only noise-perturbed versions of the latent covariates are observed; and the sample graph is a random geometric graph created from the node covariates but contaminated by independent Erdős–Rényi edges. We propose and analyze a carefully designed, task-specific GAT that constructs denoised proxy features for regression. We prove that regressing the response variables on the proxies achieves asymptotically lower error, under mild growth conditions, in (a) estimating the regression coefficient compared to the ordinary least squares (OLS) estimator on the noisy node covariates, and (b) predicting the response for an unlabelled node compared to a vanilla graph convolutional network (GCN). Our analysis leverages high-dimensional geometric tail bounds and concentration for neighbourhood counts and sample covariances. We verify our theoretical findings through experiments on synthetically generated data. We also perform experiments on real-world graphs and demonstrate the effectiveness of the attention mechanism in several node regression tasks.
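The data model in the abstract can be simulated directly. The sketch below is an illustration only: the dimensions, connection radius, contamination probability, and noise levels are arbitrary choices, not the paper's settings, and the OLS fit shown is the baseline the paper compares against, not the proposed GAT.

```python
import numpy as np

rng = np.random.default_rng(0)

n, d = 500, 3              # nodes, latent covariate dimension (assumed)
radius = 0.4               # RGG connection radius (assumed)
q = 0.02                   # Erdős–Rényi contamination probability (assumed)
beta = rng.normal(size=d)  # true regression coefficient

# Errors-in-variables model: responses come from latent covariates Z,
# but only the noise-perturbed X is observed.
Z = rng.uniform(size=(n, d))
y = Z @ beta + 0.1 * rng.normal(size=n)
X = Z + 0.3 * rng.normal(size=(n, d))

# Random geometric graph built from the latent covariates ...
dist = np.linalg.norm(Z[:, None, :] - Z[None, :, :], axis=-1)
A_rgg = (dist < radius) & ~np.eye(n, dtype=bool)

# ... contaminated by independent Erdős–Rényi edges.
A_er = np.triu(rng.random((n, n)) < q, 1)
A = A_rgg | A_er | A_er.T

# OLS on the noisy covariates: the estimator the paper's GAT provably beats.
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
```

Because X carries independent noise, `beta_ols` is attenuated relative to `beta`; the paper's point is that attention-denoised proxy features reduce this error asymptotically.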


The video games you may have missed in 2025

The Guardian

Date a vending machine, watch intergalactic television and make the most of your short existence as a fly. Here are the best games you weren't playing this year. Have you ever wanted to romance your record player? Date Everything! offers players the chance to develop relationships with everyday objects around the house, in a fully voiced sandbox romp featuring over 100 anthropomorphised characters. Wonderfully meta, it lets you put the moves on the textbox, or even on "Michael Transaction" (microtransaction, get it?). A raucous debut by indie studio à la mode games, Sorry We're Closed is a survival horror where the monster is love and the dungeon is a dingy London neighbourhood.


Argumentative Debates for Transparent Bias Detection [Technical Report]

Ayoobi, Hamed, Potyka, Nico, Rapberger, Anna, Toni, Francesca

arXiv.org Artificial Intelligence

As the use of AI in society grows, addressing emerging biases is essential to prevent systematic discrimination. Several bias detection methods have been proposed, but, with few exceptions, these tend to ignore transparency. Yet interpretability and explainability are core requirements for algorithmic fairness, even more so than for other algorithmic solutions, given the human-oriented nature of fairness. We present ABIDE (Argumentative BIas detection by DEbate), a novel framework that structures bias detection transparently as a debate, guided by an underlying argument graph as understood in (formal and computational) argumentation. The arguments concern the success chances of groups in local neighbourhoods and the significance of these neighbourhoods. We evaluate ABIDE experimentally and demonstrate its performance strengths against an argumentative baseline.
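To make the notion of "success chances of groups in local neighbourhoods" concrete, here is a hypothetical sketch of such a neighbourhood statistic. The synthetic data, the choice of k, and the k-nearest-neighbour definition of "neighbourhood" are all assumptions for illustration, not details of ABIDE itself.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 200, 4
X = rng.normal(size=(n, d))          # feature vectors
group = rng.integers(0, 2, size=n)   # protected attribute (0/1)
success = rng.random(n) < 0.5        # binary outcome

def neighbourhood_success_gap(i, k=20):
    """Difference in success rates between the two groups among the
    k nearest neighbours of point i; a large gap would be evidence
    (an 'argument') for local bias around that point."""
    dist = np.linalg.norm(X - X[i], axis=1)
    nbrs = np.argsort(dist)[1:k + 1]  # exclude i itself
    g, s = group[nbrs], success[nbrs]
    rates = [s[g == v].mean() if (g == v).any() else np.nan
             for v in (0, 1)]
    return rates[1] - rates[0]

gap = neighbourhood_success_gap(0)
```

In ABIDE's framing, such local statistics would be arguments in a debate, weighed against counter-arguments about whether the neighbourhood itself is significant.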


adversarial examples, and demonstrate that there could be infinitely many such examples lying in convex polytopes

Neural Information Processing Systems

We thank the reviewers for their feedback. All reviewers found the paper "interesting", and various reviewers commented on "the phenomenon identified here". This can also be used to guide architecture selection. Thanks for the suggestion: the findings do hold approximately for any activation function that saturates, e.g. sigmoid; we will comment on this in the camera-ready. Thank you for your suggestion! For Sect. 4.1, the softmax is just over the ...



Reviewer 1: "the statement in line 153: in the neighbourhood of z, ⟨J_i(z), f(x)⟩ = 0."

Neural Information Processing Systems

We are grateful to the reviewers for their insightful comments on our submission. All minor comments will also be addressed in the revised manuscript. We will update line 153 to "The domain of z can be easily adjusted by translation and dilation after the training process." Reviewer 1: "emphasize the need for gradient evaluations when you state the observation." "... The first and fourth columns show the relationship between the output and ..." The NN is very efficient compared to evaluating the FEM model in Case (ii).