
Online Statistical Inference of Constrained Stochastic Optimization via Random Scaling

Du, Xinchen, Zhu, Wanrong, Wu, Wei Biao, Na, Sen

arXiv.org Machine Learning

Constrained stochastic nonlinear optimization problems have attracted significant attention for their ability to model complex real-world scenarios in physics, economics, and biology. As datasets continue to grow, online inference methods have become crucial for enabling real-time decision-making without the need to store historical data. In this work, we develop an online inference procedure for constrained stochastic optimization by leveraging a method called Sketched Stochastic Sequential Quadratic Programming (SSQP). As a direct generalization of sketched Newton methods, SSQP approximates the objective with a quadratic model and the constraints with a linear model at each step, then applies a sketching solver to inexactly solve the resulting subproblem. Building on this design, we propose a new online inference procedure called random scaling. In particular, we construct a test statistic based on SSQP iterates whose limiting distribution is free of any unknown parameters. Compared to existing online inference procedures, our approach offers two key advantages: (i) it enables the construction of asymptotically valid confidence intervals; and (ii) it is matrix-free, i.e., the computation involves only primal-dual SSQP iterates $(\boldsymbol{x}_t, \boldsymbol{\lambda}_t)$ without requiring any matrix inversions. We validate our theory through numerical experiments on nonlinearly constrained regression problems and demonstrate the superior performance of our random scaling method over existing inference procedures.
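To make the random-scaling idea concrete, here is a minimal sketch in the simplest possible setting: averaged SGD for the mean of a Gaussian, not the constrained SSQP setting of the paper. It builds the studentizing quantity $V_T = T^{-2}\sum_{t=1}^T t^2(\bar{x}_t - \bar{x}_T)^2$ from the iterates alone (matrix-free, as the abstract emphasizes) and checks empirical coverage of the resulting confidence interval. The step-size schedule, sample sizes, and the critical value 6.747 (the 97.5% quantile of the scalar random-scaling pivot, as tabulated in the random-scaling literature) are illustrative assumptions, not the paper's construction.

```python
import numpy as np

rng = np.random.default_rng(0)
MU, Q = 2.0, 6.747   # Q: assumed 95% critical value of the scalar
                     # random-scaling pivot (tabulated value, not derived here)

def covers(T=2000):
    """One SGD run estimating the mean of N(MU, 1); returns True if the
    random-scaling 95% CI around the averaged iterate contains MU."""
    z = MU + rng.standard_normal(T)
    x = xbar = 0.0
    sA = sB = sC = 0.0   # running sums for V_T, expanded in powers of xbar_T
    for t in range(1, T + 1):
        x -= t ** -0.75 * (x - z[t - 1])   # SGD step on E[(x - z)^2] / 2
        xbar += (x - xbar) / t             # Polyak-Ruppert average xbar_t
        sA += t * t * xbar * xbar
        sB += t * t * xbar
        sC += t * t
    # V_T = (1/T^2) * sum_t t^2 (xbar_t - xbar_T)^2, from iterates only
    V = (sA - 2.0 * xbar * sB + xbar * xbar * sC) / T ** 2
    half = Q * np.sqrt(V / T)              # CI half-width
    return xbar - half <= MU <= xbar + half

coverage = np.mean([covers() for _ in range(200)])
```

Because $V_T$ is accumulated online from scalar running sums, the procedure never stores the iterate history and never inverts a matrix, which is the property the abstract's "matrix-free" claim refers to.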


Gen-Oja: A Simple and Efficient Algorithm for Streaming Generalized Eigenvector Computation

Bhatia, Kush, Pacchiano, Aldo, Flammarion, Nicolas, Bartlett, Peter L., Jordan, Michael I.

arXiv.org Machine Learning

In this paper, we study the problems of principal Generalized Eigenvector computation and Canonical Correlation Analysis in the stochastic setting. We propose a simple and efficient algorithm, Gen-Oja, for these problems. We prove the global convergence of our algorithm, borrowing ideas from the theory of fast-mixing Markov chains and two-time-scale stochastic approximation, showing that it achieves the optimal rate of convergence. In the process, we develop tools for understanding stochastic processes with Markovian noise which might be of independent interest.
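The two-time-scale structure mentioned above can be sketched as follows: a fast iterate `w` chases the solution of the linear system $Bw = Av$, while a slow iterate `v` performs an Oja-style update using `w`. This is a hypothetical minimal instance of a Gen-Oja-style scheme for the top generalized eigenvector of a pair $(A, B)$; the sampling model, step sizes, and diagonal test matrices are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(1)
A = np.diag([4.0, 1.0, 1.0])   # stream a_t = x x^T with x ~ N(0, A), so E[a_t] = A
B = np.diag([1.0, 2.0, 4.0])   # stream b_t = y y^T with y ~ N(0, B), so E[b_t] = B
LA, LB = np.linalg.cholesky(A), np.linalg.cholesky(B)

w = rng.standard_normal(3)          # fast iterate: tracks B^{-1} A v
v = rng.standard_normal(3)
v /= np.linalg.norm(v)              # slow iterate: updated Oja-style from w
alpha, beta = 0.05, 0.01            # fast and slow step sizes (illustrative)
for t in range(20000):
    x, y = LA @ rng.standard_normal(3), LB @ rng.standard_normal(3)
    w -= alpha * (y * (y @ w) - x * (x @ v))   # stochastic step toward B w = A v
    v += beta * w                              # Oja-style update with fast iterate
    v /= np.linalg.norm(v)

# For these diagonal matrices, B^{-1} A = diag(4, 0.5, 0.25), so the top
# generalized eigenvector is e_1; measure alignment of v with it.
align = abs(v[0])
```

The Markovian-noise difficulty the abstract refers to shows up here: the noise entering the slow update is not i.i.d., because `w` carries correlated error from its own stochastic recursion.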


On Referring Expressions in Query Answering over First Order Knowledge Bases

Borgida, Alexander (Rutgers University) | Toman, David (University of Waterloo) | Weddell, Grant (University of Waterloo)

AAAI Conferences

A referring expression in linguistics is any noun phrase identifying an object in a way that will be useful to interlocutors. In the context of a query over a first order knowledge base K, constant symbols occurring in K are the artifacts usually used as referring expressions in certain answers to the query. In this paper, we begin to explore how this can be usefully extended by allowing a class of more general formulas, called Singular Referring Expressions, to replace constants in this role. In particular, we lay a foundation for admitting Singular Referring Expressions in certain answer computation for queries over K. Integral to this foundation are characterization theorems for identification properties of Singular Referring Expressions for queries annotated with a domain specific language for referring concept types. Finally, we apply this framework in the context of tractable description logic dialects, showing how identification properties can be determined at compile-time for conjunctive queries, and how off-the-shelf conjunctive query evaluation for these dialects can be used in query evaluation, preserving, in all cases, underlying tractability.