LazySVD: Even Faster SVD Decomposition Yet Without Agonizing Pain

Allen-Zhu, Zeyuan, Li, Yuanzhi

Neural Information Processing Systems

We study k-SVD, the problem of computing the first k singular vectors of a matrix A. Recently, several breakthroughs have been made on k-SVD: Musco and Musco [19] proved the first gap-free convergence result using the block Krylov method, Shamir [21] discovered the first variance-reduction stochastic method, and Bhojanapalli et al. [7] provided the fastest O(nnz(A) + poly(1/ε))-time algorithm using alternating minimization. In this paper, we put forward a new and simple LazySVD framework that improves on all of the above. This framework leads to a faster gap-free method outperforming [19], and the first accelerated and stochastic method outperforming [21]. In the O(nnz(A) + poly(1/ε)) running-time regime, LazySVD outperforms [7] in certain parameter regimes without even using alternating minimization.
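The outer loop of the LazySVD framework is simple to sketch: run an approximate 1-SVD solver on the current matrix, deflate out the direction found, and repeat k times. Below is a minimal numpy sketch of that loop, with plain power iteration standing in for the accelerated and stochastic 1-SVD subroutines the paper actually analyzes; the function and parameter names are illustrative, not from the paper.

```python
import numpy as np

def lazy_svd(A, k, power_iters=50, rng=None):
    """Sketch of the LazySVD outer loop: repeatedly find the top right
    singular vector of the current matrix, then deflate it out.
    Power iteration on A^T A stands in for the fast 1-SVD subroutine."""
    rng = np.random.default_rng(rng)
    A = np.array(A, dtype=float)
    V = []
    for _ in range(k):
        # approximate top right singular vector via power iteration
        v = rng.standard_normal(A.shape[1])
        v /= np.linalg.norm(v)
        for _ in range(power_iters):
            v = A.T @ (A @ v)
            v /= np.linalg.norm(v)
        V.append(v)
        # deflate: A <- A (I - v v^T), removing the direction just found
        A = A - np.outer(A @ v, v)
    return np.column_stack(V)
```

The point of the framework is that the per-vector subroutine can be swapped for a faster solver without changing this outer loop.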


Polynomial Preconditioning for Gradient Methods

Doikov, Nikita, Rodomanov, Anton

arXiv.org Artificial Intelligence

We study first-order methods with preconditioning for solving structured nonlinear convex optimization problems. We propose a new family of preconditioners generated by symmetric polynomials. They provide first-order optimization methods with a provable improvement in the condition number, cutting the gaps between the highest eigenvalues, without explicit knowledge of the actual spectrum. We give a stochastic interpretation of this preconditioning in terms of coordinate volume sampling and compare it with other classical approaches, including the Chebyshev polynomials. We show how to incorporate a polynomial preconditioning into the Gradient and Fast Gradient Methods and establish the corresponding global complexity bounds. Finally, we propose a simple adaptive search procedure that automatically chooses the best possible polynomial preconditioning for the Gradient Method, minimizing the objective along a low-dimensional Krylov subspace. Numerical experiments confirm the efficiency of our preconditioning strategies for solving various machine learning problems.
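To see why a polynomial of the Hessian can help, consider a quadratic f(x) = (1/2) x^T A x - b^T x whose Hessian has one outlying top eigenvalue. The sketch below uses a simple degree-1 polynomial p(t) = c - t, not the symmetric-polynomial construction from the paper, and assumes a known upper bound c on the largest eigenvalue; it shows how applying p(A) to the gradient cuts the gap between the top eigenvalue and the rest, so gradient descent converges much faster.

```python
import numpy as np

def gd(grad, x0, h, iters):
    """Plain gradient descent with fixed step size h."""
    x = x0.copy()
    for _ in range(iters):
        x = x - h * grad(x)
    return x

rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
A = Q @ np.diag([100.0, 1.0, 0.5]) @ Q.T   # Hessian with condition number 200
b = rng.standard_normal(3)
x_star = np.linalg.solve(A, b)

grad = lambda x: A @ x - b
c = 101.0                      # assumed upper bound on the largest eigenvalue
P = c * np.eye(3) - A          # degree-1 polynomial preconditioner p(t) = c - t
pgrad = lambda x: P @ grad(x)  # effective Hessian becomes P A, eigenvalues t(c - t)

# Eigenvalues of P A: 100*1 = 100, 1*100 = 100, 0.5*100.5 = 50.25,
# so the condition number drops from 200 to about 2.
x0 = np.zeros(3)
x_plain = gd(grad, x0, h=2 / (100.0 + 0.5), iters=100)
x_pre = gd(pgrad, x0, h=2 / (100.0 + 50.25), iters=100)
```

After 100 steps the preconditioned iterate is accurate to roughly machine precision, while plain gradient descent has only shrunk the error by a constant factor; the paper's symmetric-polynomial family achieves this kind of gap-cutting without knowing the spectrum.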


eBay CTO: AI is now an 'ecosystem' for us

#artificialintelligence

While eBay could have used any number of existing AI platforms to enhance its various products, the company instead elected to build its own AI system -- dubbed Krylov -- in-house and make it open source for anyone to use. That decision appears to be paying off. The San Jose-based company has made no secret of its AI ambitions over the past four years, hoovering up technical talent via acquisitions and launching myriad automated tools to spot credit card fraud, improve product listings, and bring other nifty features to buyers and sellers. Late last year, eBay also relaunched a standalone vehicle-focused Motors app, with AI and automation at its core. Such is the pervasiveness of AI across eBay in 2020 that the company's chief technology officer, Mazen Rawashdeh, says AI is now an "ecosystem" at the company.


Why eBay believes in open-sourcing Krylov, its AI platform

#artificialintelligence

It's hard to find a tech company that isn't attempting some sort of AI-related product, service, or initiative these days, but eBay went all-in by building its own AI platform, called Krylov. Sanjeev Katariya, eBay's VP and chief architect of AI and platforms, described Krylov in an interview with VentureBeat: "At the very highest level, Krylov is a machine learning platform that enables data scientists and machine learning engineers to ship all different kinds of models for all kinds of data quickly into production, which gets integrated into user experiences that eBay ships globally." It's a multi-tenant, cloud-based platform that involves technologies like computer vision and natural language processing (NLP), techniques including distributed training and hyper-parameter tuning, and tools germane to eBay's services, like merchandising recommendations, buyer personalization, seller price guidance, and shipping estimates. Whatever the hard costs may be, they wouldn't fully capture what eBay has invested to build the platform over years of internal organizational effort around the globe. And after all that, eBay is now open-sourcing Krylov.


New eBay platform using AI to enable image search and internal innovation

#artificialintelligence

Many of the biggest tech companies, like Google, Facebook, and Amazon, have realized the value of creating their own AI platforms for both internal and customer-facing services. Facebook's FBLearner Flow helps the social media site filter out offensive posts, while Uber's Michelangelo gives users time predictions for food deliveries. To keep up with the competition, eBay has unveiled its AI platform, Krylov, which has given the company a wide range of new capabilities, from improved language translation services to searching with images. In a blog post, eBay's Sanjeev Katariya, vice president and chief architect of eBay AI and platforms, and Ashok Ramani, director of product management for computer vision and natural language processing, discussed the creation of Krylov and how it has changed things both inside eBay and for users of the site. "With computer vision powered by eBay's modern AI platform, the technology helps you find items based on the click of your camera or an image. Users can go onto the eBay app and take a photo of what they are looking for and within milliseconds, the platform surfaces items that match the image," Katariya and Ramani wrote in December.

