



A Unified Discretization Framework for Differential Equation Approach with Lyapunov Arguments for Convex Optimization

Neural Information Processing Systems

The differential equation (DE) approach for convex optimization, which relates optimization methods to specific continuous DEs with rate-revealing Lyapunov functionals, has gained increasing interest since the seminal paper by Su-Boyd-Candès (2014).
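To make the DE approach concrete, here is a minimal numerical sketch (not taken from this paper) of the Su-Boyd-Candès ODE X'' + (3/t) X' + ∇f(X) = 0, which models Nesterov's accelerated gradient method. The test function f(x) = x²/2 and the naive forward-Euler discretization are illustrative assumptions; the associated Lyapunov functional E(t) = t²(f(X) − f*) + 2|X + tX'/2 − x*|² is non-increasing along trajectories, which yields the rate f(X(t)) − f* = O(1/t²).

```python
# Illustrative sketch, not the paper's method: simulate the ODE
#   X'' + (3/t) X' + grad f(X) = 0
# on f(x) = 0.5 * x^2 (so grad f(x) = x and f* = 0) with forward Euler.
def simulate(t0=1.0, T=50.0, h=1e-3, x0=1.0):
    t, x, v = t0, x0, 0.0           # time, position X(t), velocity X'(t)
    while t < T:
        a = -(3.0 / t) * v - x      # acceleration X'' from the ODE
        x += h * v                  # naive explicit Euler step
        v += h * a
        t += h
    return x

# Since the Lyapunov functional E(t) is non-increasing, f(X(t)) - f*
# is bounded by E(t0) / t^2, so |x| should be small by t = 50.
```

The continuous-time rate transfers to discrete methods precisely when the discretization preserves a (discrete analogue of the) Lyapunov functional, which is the kind of question a unified discretization framework addresses.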



A List of definitions and notations

Neural Information Processing Systems

For the convenience of the reader, we summarize a list of notations below. In Appendix B.1, we present a general statement of Theorem 3.1 (a) along with its proof; Theorem 3.1 (a) states the order recovery guarantee for a specified parameter. We summarize the bounds for (I) and (II) in Lemma B.1 and Lemma B.2. Collecting the results in Lemma B.1 and Lemma B.2 and reorganizing the terms in the inequalities, we obtain the following conclusion. We now state the proof of this lemma. We bound the first term using the concentration bound on chi-squared random variables. For non-identifiable models, we can use Lemma H.1 in a similar way to obtain an analogous bound that holds with high probability.


Impure Simplicial Complex and Term-Modal Logic with Assignment Operators

Yang, Yuanzhe

arXiv.org Artificial Intelligence

Impure simplicial complexes are a powerful tool to model multi-agent epistemic situations where agents may die, but it is difficult to define a satisfactory semantics for the ordinary propositional modal language on such models, since many conceptually dubious expressions involving dead agents can be expressed in this language. In this paper, we introduce a term-modal language with assignment operators, in which such conceptually dubious expressions are syntactically excluded. We define both simplicial semantics and first-order Kripke semantics for this language, characterize their respective expressivity through notions of bisimulation, and show that the two semantics are equivalent when we consider a special class of first-order Kripke models called local epistemic models. We also offer a complete axiomatization for the epistemic logic based on this language, and show that our language has a notion of assignment normal form. Finally, we discuss the behavior of a kind of intensional distributed knowledge that can be naturally expressed in our language.





A Unified Convergence Analysis for Semi-Decentralized Learning: Sampled-to-Sampled vs. Sampled-to-All Communication

Rodio, Angelo, Neglia, Giovanni, Chen, Zheng, Larsson, Erik G.

arXiv.org Artificial Intelligence

In semi-decentralized federated learning, devices primarily rely on device-to-device communication but occasionally interact with a central server. Periodically, a sampled subset of devices uploads their local models to the server, which computes an aggregate model. The server can then either (i) share this aggregate model only with the sampled clients (sampled-to-sampled, S2S) or (ii) broadcast it to all clients (sampled-to-all, S2A). Despite their practical significance, a rigorous theoretical and empirical comparison of these two strategies remains absent. We address this gap by analyzing S2S and S2A within a unified convergence framework that accounts for key system parameters: sampling rate, server aggregation frequency, and network connectivity. Our results, both analytical and experimental, reveal distinct regimes where one strategy outperforms the other, depending primarily on the degree of data heterogeneity across devices. These insights lead to concrete design guidelines for practical semi-decentralized FL deployments.
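The server-side difference between the two strategies can be sketched in a few lines. This is a hedged toy illustration, not the paper's algorithm: models are plain NumPy rows, `server_round` is a hypothetical helper, and local device-to-device gossip between server rounds is omitted.

```python
import numpy as np

# Toy sketch of one server aggregation round. Under S2S only the sampled
# devices receive the aggregate; under S2A it is broadcast to everyone.
def server_round(models, sampled, mode):
    agg = models[sampled].mean(axis=0)   # server averages sampled models
    out = models.copy()
    if mode == "S2S":
        out[sampled] = agg               # return aggregate to sampled clients only
    else:                                # "S2A"
        out[:] = agg                     # broadcast aggregate to all clients
    return out
```

S2A fully resynchronizes the network in one round, while S2S leaves unsampled clients to catch up through subsequent device-to-device mixing, which is why network connectivity and data heterogeneity determine which strategy wins.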