Manifold Optimisation Assisted Gaussian Variational Approximation

arXiv.org Machine Learning

Variational approximation methods approximate the posterior in Bayesian inference, especially when the dataset is large in volume or dimension. Previous work introduced a factor covariance structure with three restrictions to keep Gaussian approximation computationally feasible. However, these three strong constraints on the covariance matrix can break down during structure optimization, and an identification issue may persist in the final approximation. In this paper, we consider two manifold parameterizations, the Stiefel manifold and the Grassmann manifold, to address these problems. Riemannian stochastic gradient descent is then applied to solve the resulting optimization problem while maintaining orthogonal factors. Results from two experiments demonstrate that our model fixes the potential issues of the previous method with comparable accuracy and competitive convergence speed, even in high-dimensional problems.
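
To make the optimization step concrete, here is a minimal sketch of one Riemannian SGD step on the Stiefel manifold, assuming NumPy, a QR-based retraction, and an arbitrary stochastic gradient; the function name, step size, and retraction choice are illustrative, not the paper's exact implementation.

# One Riemannian SGD step on the Stiefel manifold
# St(p, k) = {W in R^{p x k} : W^T W = I}.
import numpy as np

def stiefel_sgd_step(W, euclid_grad, lr=1e-2):
    """Take one Riemannian gradient step and return a point that
    still has orthonormal columns."""
    # Project the Euclidean gradient onto the tangent space at W.
    sym = 0.5 * (W.T @ euclid_grad + euclid_grad.T @ W)
    riem_grad = euclid_grad - W @ sym
    # Step along the negative Riemannian gradient, then retract back
    # onto the manifold via a QR decomposition.
    Q, R = np.linalg.qr(W - lr * riem_grad)
    # Fix column signs so the retraction is uniquely defined.
    d = np.sign(np.diag(R))
    d[d == 0] = 1.0
    return Q * d

# Toy check: the iterate stays on the manifold.
rng = np.random.default_rng(0)
W = np.linalg.qr(rng.standard_normal((50, 5)))[0]  # random Stiefel point
G = rng.standard_normal((50, 5))                   # stand-in stochastic gradient
W_next = stiefel_sgd_step(W, G)
assert np.allclose(W_next.T @ W_next, np.eye(5), atol=1e-8)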


UMAP: Uniform Manifold Approximation and Projection for Dimension Reduction

arXiv.org Machine Learning

UMAP (Uniform Manifold Approximation and Projection) is a novel manifold learning technique for dimension reduction. UMAP is constructed from a theoretical framework based in Riemannian geometry and algebraic topology. The result is a practical, scalable algorithm that applies to real-world data. The UMAP algorithm is competitive with t-SNE for visualization quality, and arguably preserves more of the global structure with superior run-time performance. Furthermore, UMAP as described has no computational restrictions on embedding dimension, making it viable as a general-purpose dimension reduction technique for machine learning.
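
For readers who want to try it, below is a minimal usage sketch assuming the umap-learn package (pip install umap-learn) and scikit-learn for a sample dataset; the parameter values shown are common defaults rather than tuned settings.

# Embed the 64-dimensional digits dataset into 2-D with UMAP.
import umap
from sklearn.datasets import load_digits

X, y = load_digits(return_X_y=True)  # 1797 samples, 64 features

# n_neighbors controls the local/global trade-off; min_dist controls
# how tightly points are packed in the embedding.
reducer = umap.UMAP(n_neighbors=15, min_dist=0.1, n_components=2,
                    random_state=42)
embedding = reducer.fit_transform(X)  # shape (1797, 2)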


Non-parametric Regression Between Manifolds

Neural Information Processing Systems

Non-parametric regression between manifolds is a learning problem that arises frequently in application areas ranging from signal processing and computer vision to robotics and computer graphics. We present a new algorithmic scheme for the solution of this general learning problem based on regularized empirical risk minimization. The regularization functional takes into account the geometry of the input and output manifolds, and we show that it implements a particularly natural prior. Moreover, we demonstrate that our algorithm performs well on a difficult surface registration problem.
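
As a toy illustration of regression with a manifold-valued output, the sketch below fits a map from R to the circle S^1 by regularized least squares in the ambient space R^2 and then projects predictions back onto the circle; the Fourier basis, ridge penalty, and projection step are crude illustrative stand-ins for the paper's geometry-aware regularizer.

# Regularized regression from R onto the circle S^1 (toy version).
import numpy as np

rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0, 2 * np.pi, 100))                # inputs in R
Y = np.c_[np.cos(t), np.sin(t)] \
    + 0.1 * rng.standard_normal((100, 2))                  # noisy S^1 targets

# Feature map: a small Fourier basis (illustrative choice).
freqs = np.arange(1, 6)
Phi = np.c_[np.ones_like(t),
            np.cos(np.outer(t, freqs)),
            np.sin(np.outer(t, freqs))]

# Ridge-regularized least squares in the ambient space.
lam = 1e-2
W = np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]), Phi.T @ Y)

# Project predictions back onto the output manifold S^1.
pred = Phi @ W
pred /= np.linalg.norm(pred, axis=1, keepdims=True)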


Classification via local manifold approximation

arXiv.org Machine Learning

Classifiers label data as belonging to one of a set of groups based on input features. It is challenging to obtain accurate classification performance when the feature distributions in the different classes are complex, with nonlinear, overlapping, and intersecting supports. This is particularly true when training data are limited. To address this problem, this article proposes a new type of classifier based on obtaining a local approximation to the support of the data within each class in a neighborhood of the feature to be classified, and assigning the feature to the class with the closest support. This general algorithm is referred to as LOcal Manifold Approximation (LOMA) classification. As a simple and theoretically supported special case with excellent performance in a broad variety of examples, we use spheres for local approximation, obtaining a SPherical Approximation (SPA) classifier. We illustrate substantial gains for SPA over competitors on a variety of challenging simulated and real data examples.
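
The following is a minimal sketch of the SPA idea under simplifying assumptions: fit a sphere by linear least squares to the k nearest same-class training points around the query, then assign the query to the class whose sphere passes closest. The helper names and the choice of k are illustrative; the paper's theoretically supported construction differs in its details.

# SPherical Approximation (SPA) classification, toy version.
import numpy as np

def fit_sphere(X):
    """Least-squares sphere fit: solve 2 x.c + b = |x|^2 for the
    center c, with b = r^2 - |c|^2."""
    A = np.c_[2 * X, np.ones(len(X))]
    sol, *_ = np.linalg.lstsq(A, np.sum(X**2, axis=1), rcond=None)
    c, b = sol[:-1], sol[-1]
    r = np.sqrt(max(b + c @ c, 0.0))
    return c, r

def spa_classify(x, X_train, y_train, k=10):
    scores = {}
    for label in np.unique(y_train):
        Xc = X_train[y_train == label]
        # k nearest neighbors of the query within this class.
        nn = Xc[np.argsort(np.linalg.norm(Xc - x, axis=1))[:k]]
        c, r = fit_sphere(nn)
        # Distance from the query to the fitted sphere.
        scores[label] = abs(np.linalg.norm(x - c) - r)
    return min(scores, key=scores.get)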


Manifold Representations for Value-Function Approximation

AAAI Conferences

Reinforcement learning (RL) has been shown to be an effective paradigm for learning control policies for problems with discrete state spaces. For problems with continuous, multidimensional state spaces, the results are less compelling. When these state spaces can be effectively discretized, traditional techniques can be applied. However, many interesting problems must be discretized into an infeasibly large number of states. In these cases, other techniques must be used.
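
Since the abstract stops before describing the paper's manifold representation, the sketch below only illustrates the generic alternative it alludes to: linear value-function approximation with radial basis features and a TD(0) update on a continuous one-dimensional state space. All names and constants are illustrative, and this is not the paper's method.

# Linear value-function approximation with TD(0) and RBF features.
import numpy as np

centers = np.linspace(0.0, 1.0, 11)  # RBF centers tiling the state space
width = 0.1

def features(s):
    """Radial basis features for a scalar state s."""
    return np.exp(-((s - centers) ** 2) / (2 * width**2))

def td0_update(w, s, reward, s_next, done, alpha=0.1, gamma=0.99):
    """One TD(0) step: move w toward the bootstrapped target,
    where V(s) is approximated by w . features(s)."""
    v = w @ features(s)
    v_next = 0.0 if done else w @ features(s_next)
    return w + alpha * (reward + gamma * v_next - v) * features(s)

# Usage: apply one update to an initial weight vector.
w = np.zeros_like(centers)
w = td0_update(w, s=0.3, reward=1.0, s_next=0.35, done=False)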