Global Optimization with A Power-Transformed Objective and Gaussian Smoothing
arXiv.org Artificial Intelligence
We propose a novel method that solves global optimization problems in two steps: (1) apply an (exponential) power-$N$ transformation to the not-necessarily-differentiable objective function $f$ to obtain $f_N$, and (2) optimize the Gaussian-smoothed $f_N$ with stochastic approximation. Under mild conditions on $f$, for any $\delta>0$, we prove that with a sufficiently large power $N_\delta$, this method converges to a solution in the $\delta$-neighborhood of $f$'s global optimum point. The convergence rate is $O(d^2\sigma^4\varepsilon^{-2})$, which is faster than both the standard and single-loop homotopy methods if $\sigma$ is pre-selected to lie in $(0,1)$. In most of our experiments, the method produces better solutions than other algorithms that also apply smoothing techniques.
Dec-23-2024
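The two-step scheme in the abstract can be sketched in code: exponentiate the objective with power $N$, smooth the result with Gaussian noise of scale $\sigma$, and ascend a Monte-Carlo estimate of the smoothed gradient. This is only a minimal illustration under assumed details; the function names, the two-sided gradient estimator, and all hyperparameter values below are our own assumptions, not the paper's exact algorithm.

```python
import numpy as np

def smoothed_grad_estimate(f, x, N=2.0, sigma=0.5, batch=64, rng=None):
    """Monte-Carlo estimate of the gradient of the Gaussian smoothing of
    f_N(x) = exp(N * f(x)).  Hypothetical sketch; estimator form assumed."""
    rng = np.random.default_rng() if rng is None else rng
    u = rng.standard_normal((batch, x.shape[0]))
    # two-sided estimator of grad_x E_u[ exp(N * f(x + sigma*u)) ]
    fp = np.exp(N * np.array([f(x + sigma * ui) for ui in u]))
    fm = np.exp(N * np.array([f(x - sigma * ui) for ui in u]))
    return ((fp - fm)[:, None] * u).mean(axis=0) / (2.0 * sigma)

def optimize(f, x0, steps=500, lr=0.1, **kw):
    """Stochastic gradient ascent on the smoothed, power-transformed f."""
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x += lr * smoothed_grad_estimate(f, x, **kw)
    return x
```

For example, maximizing $f(x) = -\lVert x - \mathbf{1}\rVert^2$ from a start of $(0.5, 0.5)$ drives the iterate toward the global maximizer $\mathbf{1}$; the power transform sharpens the peak while the Gaussian smoothing keeps the surrogate differentiable even when $f$ itself is not.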