A Gunsilius's Algorithm

Neural Information Processing Systems 

Gunsilius (2020) provides a theoretical framework establishing minimal conditions under which a continuous IV model implies non-trivial bounds (that is, bounds tighter than what can be obtained by merely assuming that the density p(x, y | z) exists). That work also introduces two variants of an algorithm for fitting bounds. The algorithm pre-samples l response functions from a fixed Gaussian process; the final distribution is a reweighted combination of these pre-sampled response functions, with the weights µ playing the role of the decision variables to be optimized. Hence, by construction, the space of distributions over response functions is absolutely continuous with respect to the pre-defined Gaussian process. Large deviation bounds are then used to show the (intuitive) result that this approximation is a probably approximately correct formulation of the original optimization problem. One issue with this algorithm is that l may be required to be large, as it is a non-adaptive Monte Carlo approximation in a high-dimensional space. A variant is described where, every time a solution for µ is found, response function samples with low corresponding values of µ are replaced by fresh draws (again, from the given, non-adaptive Gaussian process).
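As a rough illustration of the reweighting idea (not Gunsilius's actual procedure), the sketch below pre-samples l response functions from a fixed Gaussian process and optimizes simplex weights µ over them. Everything here is a simplifying assumption for illustration: the instrument grid `z_grid`, the stand-in observed means `obs_mean`, the RBF kernel, the tolerance `eps`, and the reduction to a linear program that matches conditional means while extremizing a single functional of the response.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)

# Hypothetical observed data: instrument support and stand-in values for
# the conditional means E[Y | Z = z] (names here are illustrative).
z_grid = np.linspace(0.0, 1.0, 5)
obs_mean = 0.5 + 0.3 * z_grid

l = 200  # number of pre-sampled response functions (non-adaptive)

# Pre-sample l response functions from a fixed Gaussian process (RBF kernel).
def rbf_kernel(a, b, length=0.3):
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length ** 2)

K = rbf_kernel(z_grid, z_grid) + 1e-8 * np.eye(len(z_grid))
samples = 0.5 + np.linalg.cholesky(K) @ rng.standard_normal((len(z_grid), l))

# Functional to bound: e.g. the response difference between the endpoints
# of the instrument's support.
target = samples[-1, :] - samples[0, :]

# Weights mu live on the simplex; the reweighted mixture of pre-sampled
# response functions must reproduce the observed means within eps.
eps = 0.1
A_ub = np.vstack([samples, -samples])
b_ub = np.concatenate([obs_mean + eps, -(obs_mean - eps)])
A_eq = np.ones((1, l))
b_eq = [1.0]

lo = linprog(target, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
             bounds=[(0.0, 1.0)] * l)
hi = linprog(-target, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
             bounds=[(0.0, 1.0)] * l)
print("bounds on target functional:", lo.fun, -hi.fun)
```

In this simplified picture, the adaptive variant mentioned above would inspect the solved weights (`lo.x`, `hi.x`), replace the columns of `samples` carrying negligible weight with fresh draws from the same Gaussian process, and re-solve, without ever changing the sampling distribution itself.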