Accelerated primal-dual methods with enlarged step sizes and operator learning for nonsmooth optimal control problems
Song, Yongcun; Yuan, Xiaoming; Yue, Hangrui
arXiv.org Artificial Intelligence
We consider a general class of nonsmooth optimal control problems with partial differential equation (PDE) constraints, which are very challenging due to their nonsmooth objective functionals and the high-dimensional, ill-conditioned systems that arise after discretization. We focus on the application of a primal-dual method, in which different types of variables can be treated individually at each iteration, so that the main computation per iteration reduces to solving two PDEs. Our goal is to accelerate the primal-dual method with either enlarged step sizes or operator learning techniques. The accelerated primal-dual method with enlarged step sizes improves the numerical performance of the original primal-dual method in a simple and universal way, while its convergence can still be proved rigorously. For the operator learning acceleration, we construct deep neural network surrogate models for the involved PDEs. Once a neural operator is learned, solving a PDE requires only a forward pass of the neural network, and the computational cost is thus substantially reduced. The accelerated primal-dual method with operator learning is mesh-free, numerically efficient, and scalable to different types of PDEs. The effectiveness of both acceleration techniques is validated by promising preliminary numerical results.
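The abstract does not spell out the iteration, but the primal-dual scheme it refers to is of Chambolle-Pock type: each iteration alternates a dual ascent step, a primal proximal step, and an extrapolation, with step sizes tau and sigma constrained by tau*sigma*||K||^2 <= 1 (the "enlarged step sizes" relax exactly this kind of bound). As a hedged illustration only, the sketch below applies this generic template to a toy finite-dimensional problem min_x 0.5*||Kx - b||^2 + alpha*||x||_1; the toy objective, the function names, and all parameter values are assumptions for illustration, not the paper's actual PDE-constrained setting, where the two linear-system solves would be replaced by PDE solves (or by neural-operator forward passes).

```python
import numpy as np

def prox_l1(v, t):
    """Proximal operator of t*||.||_1 (soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def primal_dual(K, b, alpha, tau, sigma, iters=500):
    """Generic Chambolle-Pock iteration for min_x alpha*||x||_1 + f(Kx),
    with f(z) = 0.5*||z - b||^2 as a stand-in for the data-fidelity term.
    This is an illustrative template, not the paper's method: in the
    PDE-constrained setting, applying K and K^T corresponds to solving
    the state and adjoint PDEs (the "two PDEs" per iteration)."""
    m, n = K.shape
    x = np.zeros(n)
    x_bar = x.copy()
    y = np.zeros(m)
    for _ in range(iters):
        # Dual ascent: prox of sigma*f*, which for f = 0.5*||.-b||^2
        # has the closed form below.
        y = (y + sigma * (K @ x_bar - b)) / (1.0 + sigma)
        # Primal descent: prox of tau*alpha*||.||_1 after a gradient step.
        x_new = prox_l1(x - tau * (K.T @ y), tau * alpha)
        # Extrapolation with theta = 1; convergence needs
        # tau*sigma*||K||^2 <= 1 in the classical analysis.
        x_bar = 2.0 * x_new - x
        x = x_new
    return x
```

For K = I the exact minimizer is the soft-thresholded data, which gives a cheap sanity check of the iteration; the step-size bound tau*sigma*||K||^2 <= 1 is the condition that enlarged-step-size variants aim to weaken.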
Jul-25-2023