Adaptive Quantile Low-Rank Matrix Factorization

arXiv.org Machine Learning

Low-rank matrix factorization (LRMF) has gained considerable popularity owing to its successful applications in both computer vision and data mining. By assuming the noise term to follow a Gaussian, Laplace, or mixture-of-Gaussians distribution, significant efforts have been devoted to optimizing the (weighted) $L_1$- or $L_2$-norm loss between an observed matrix and its bilinear factorization. However, the type of noise distribution is generally unknown in real applications, and an inappropriate assumption inevitably deteriorates the behavior of LRMF. Moreover, real data are often corrupted by skewed rather than symmetric noise. To tackle this problem, this paper presents a novel LRMF model called AQ-LRMF, which models the noise with a mixture of asymmetric Laplace distributions. An efficient procedure based on the expectation-maximization (EM) algorithm is also offered to estimate the parameters involved in AQ-LRMF. The AQ-LRMF model has the advantage that it can approximate the noise well regardless of whether the real noise is symmetric or skewed. The core idea of AQ-LRMF lies in solving a weighted $L_1$ problem with weights learned from the data. Experiments on synthetic and real datasets show that AQ-LRMF outperforms several state-of-the-art techniques. Furthermore, AQ-LRMF has the additional advantage over the other algorithms that it can capture local structural information contained in real images.
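The link between asymmetric Laplace noise and a weighted $L_1$ problem can be sketched as follows. The snippet below uses a common quantile parameterization of the asymmetric Laplace distribution (ALD); it is an illustration of the general connection, not necessarily the paper's exact notation. Maximizing the ALD log-likelihood in the location parameter is equivalent to minimizing the pinball (check) loss, which penalizes positive and negative residuals with different weights:

```python
import numpy as np

def pinball(u, tau):
    # check (quantile) loss: tau*u for u >= 0, (tau - 1)*u for u < 0,
    # so residual signs are weighted asymmetrically unless tau = 0.5
    return np.where(u >= 0, tau * u, (tau - 1) * u)

def ald_logpdf(x, mu=0.0, sigma=1.0, tau=0.5):
    # asymmetric Laplace log-density in its quantile parameterization:
    # log f(x) = log(tau*(1-tau)/sigma) - pinball((x - mu)/sigma, tau)
    return np.log(tau * (1 - tau) / sigma) - pinball((x - mu) / sigma, tau)
```

With `tau = 0.5` the loss is the symmetric $L_1$ loss, so the symmetric Laplace model is recovered as a special case; learning `tau` (and mixing several components) is what lets a model of this kind adapt to skewed noise.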


Laplace Matching for fast Approximate Inference in Generalized Linear Models

arXiv.org Machine Learning

Bayesian inference in generalized linear models (GLMs), i.e.~Gaussian regression with non-Gaussian likelihoods, is generally non-analytic and requires computationally expensive approximations such as sampling or variational inference. We propose an approximate inference framework primarily designed to be computationally cheap while still achieving high approximation quality. The concept, which we call \emph{Laplace Matching}, involves closed-form, approximate, bi-directional transformations between the parameter spaces of exponential families. These are constructed from Laplace approximations under custom-designed basis transformations. The mappings can then be leveraged to turn a latent Gaussian distribution into a conjugate prior for a rich class of observable variables, effectively turning inference in GLMs into conjugate inference (with small approximation errors). We empirically evaluate the method on two different GLMs, showing approximation quality comparable to state-of-the-art approximate inference techniques at a drastic reduction in computational cost. More specifically, our method has a cost comparable to the \emph{very first} step of the iterative optimization usually employed in standard GLM inference.
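As an illustrative instance of the kind of closed-form mapping such a framework relies on, consider a Laplace approximation of a Gamma distribution after the basis change $z = \log x$; the function name and parameterization below are our own, not the paper's API:

```python
import numpy as np

def gamma_to_gaussian_log_space(alpha, beta):
    """Laplace-approximate Gamma(alpha, beta) in the log basis z = log(x).

    The transformed log-density is alpha*z - beta*exp(z) + const, whose
    mode is z* = log(alpha/beta); the negative second derivative there
    is alpha, so the approximation is N(log(alpha/beta), 1/alpha).
    """
    return np.log(alpha / beta), 1.0 / alpha
```

A latent Gaussian over $z$ can then stand in for a Gamma prior over $x = e^z$, which is what makes cheap conjugate-style updates available in place of iterative optimization.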


Colorado fly fisherman rides ice down river in TikTok video

FOX News

A fly fisherman in Colorado was filmed taking quite a ride on the Arkansas River -- but it wasn't on a boat. Instead, Cade Peirce was standing -- and fishing -- on a chunk of ice as it floated down the river. His wife, Morgan Peirce, posted the video on TikTok last week.


Winklevoss twins' Bitcoin ETF got rejected a second time, but...

Mashable

The SEC has rejected a second proposal to list and trade shares of the Winklevoss Bitcoin Trust on the Bats BZX Exchange, which would essentially have been the launch of the first Bitcoin ETF. Cameron and Tyler Winklevoss, founders of the Gemini cryptocurrency exchange and prominent proponents of Bitcoin, had already been rejected in March 2017. The SEC dismissed the amended proposal on Thursday by a 3-to-1 vote, rejecting BZX's claim that Bitcoin markets are "uniquely resistant to manipulation" and questioning whether BZX can do enough to deter fraud and manipulation on the market. Following the SEC's decision, which sharply drove the price of Bitcoin down from $8,287 to about $7,900, the SEC published Commissioner Hester Peirce's dissent. "Contrary to the Commission's determination, I believe that the proposed rule change satisfies the statutory standard and that we should permit BZX to list and trade this bitcoin-based exchange-traded product ("ETP")," Peirce wrote.


Asymmetric Distributions from Constrained Mixtures

arXiv.org Machine Learning

This paper introduces constrained mixtures for continuous distributions: mixtures in which each component has a shape similar to the base distribution and the components have disjoint domains. This new concept is used to create generalized asymmetric versions of the Laplace and normal distributions, which are shown to define exponential families with known conjugate priors and to admit closed-form maximum likelihood estimates for the original parameters. The asymmetric and symmetric normal distributions are compared in a linear regression example, where the asymmetric version performs at least as well as the symmetric one, and in a real-world time-series problem, where a hidden Markov model is used to fit a stock index; there, the asymmetric version provides higher likelihood and may learn state and transition distributions with considerably less entropy.
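A minimal sketch of the disjoint-domain construction for the normal base case is the classical two-piece normal (parameter names below are our own, not necessarily the paper's): two half-normal components living on $x < \mu$ and $x \geq \mu$, with mixture weights chosen so the density stays continuous at the join:

```python
import numpy as np

def two_piece_normal_pdf(x, mu=0.0, s_left=1.0, s_right=2.0):
    # Mixture of two half-normals on disjoint domains (x < mu, x >= mu).
    # The weights s_left/(s_left+s_right) and s_right/(s_left+s_right)
    # make the density continuous at mu; s_left != s_right gives skew.
    x = np.asarray(x, dtype=float)
    scale = np.where(x < mu, s_left, s_right)
    z = (x - mu) / scale
    return 2.0 / (s_left + s_right) * np.exp(-0.5 * z**2) / np.sqrt(2 * np.pi)
```

Because each component is a rescaled copy of the base density restricted to its own half-line, the result is asymmetric whenever the two scales differ (its mean sits at $\mu + \sqrt{2/\pi}\,(s_{\mathrm{right}} - s_{\mathrm{left}})$), while the symmetric normal is recovered when they coincide.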