Energy Preserving Neural Networks – ML Review – Medium

#artificialintelligence

In the current deep learning era there is a neural network architecture for virtually every problem: ResNet and other CNNs for computer vision, RNNs with attention for language and speech, and so on. Deep learning models have surpassed almost all traditional machine learning techniques and are the current state of the art (SOTA) on many problems that were once deemed impossible for a computer.


Multi-Party Computation on Machine Learning

#artificialintelligence

I developed a technique that lets three parties jointly obtain machine-learning results computed across their non-public datasets.
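The teaser does not detail the author's protocol, but the core idea behind many three-party schemes can be illustrated with additive secret sharing: each party splits its private value into random shares, and only the combination of everyone's shares reveals the aggregate. The sketch below is a minimal illustration, not the article's method; the prime modulus P, the toy inputs, and the helper names are assumptions made for the example.

```python
import secrets

P = 2**61 - 1  # large prime modulus (illustrative choice)

def share(value, n_parties=3):
    """Split an integer into n additive shares modulo P."""
    shares = [secrets.randbelow(P) for _ in range(n_parties - 1)]
    last = (value - sum(shares)) % P
    return shares + [last]

def reconstruct(shares):
    """Recover the secret by summing all shares modulo P."""
    return sum(shares) % P

# Each party secret-shares its private statistic (e.g., a gradient component).
private_inputs = [42, 17, 99]                       # one value per party
all_shares = [share(v) for v in private_inputs]     # party i -> its 3 shares

# Party j locally adds the j-th share it received from every party.
local_sums = [sum(col) % P for col in zip(*all_shares)]

# Publishing only the local sums reveals the aggregate, not the inputs.
assert reconstruct(local_sums) == sum(private_inputs) % P
```

Secure aggregation of this kind is the building block that lets statistics (sums, averages, gradients) be computed jointly while each raw dataset stays private.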


Pruning Neural Networks

#artificialintelligence

Much of the success of deep learning has come from building larger and larger neural networks. This lets the models perform better on many tasks, but it also makes them more expensive to use: larger models take more storage space, which makes them harder to distribute, and they take more time to run and can require more expensive hardware. This is a particular concern when you are productionizing a model for a real-world application.
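The article's own pruning recipe is not reproduced here, but the common baseline it builds on, magnitude pruning, is easy to state: rank weights by absolute value and zero out the smallest fraction. Below is a minimal NumPy sketch; the function name `magnitude_prune` and the 90% sparsity setting are illustrative assumptions, not taken from the article.

```python
import numpy as np

def magnitude_prune(weights, sparsity=0.9):
    """Zero out the smallest-magnitude weights of one layer.

    weights  : array of layer weights
    sparsity : fraction of weights to remove (0.9 -> keep the largest 10%)
    Returns the pruned weights and the binary mask that was applied.
    """
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    threshold = np.sort(flat)[k] if k < flat.size else np.inf
    mask = (np.abs(weights) >= threshold).astype(weights.dtype)
    return weights * mask, mask

# Toy example: prune 90% of a randomly initialized layer.
rng = np.random.default_rng(0)
w = rng.normal(size=(256, 128))
w_pruned, mask = magnitude_prune(w, sparsity=0.9)
print("remaining weights:", int(mask.sum()), "of", mask.size)
```

In practice the pruned network is usually fine-tuned afterwards so the remaining weights can compensate for the removed ones.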


Gaussian Processes for Machine Learning (Adaptive Computation and Machine Learning series)

#artificialintelligence

Gaussian processes (GPs) provide a principled, practical, probabilistic approach to learning in kernel machines. GPs have received increased attention in the machine-learning community over the past decade, and this book provides a long-needed systematic and unified treatment of the theoretical and practical aspects of GPs in machine learning. The treatment is comprehensive and self-contained, targeted at researchers and students in machine learning and applied statistics. The book deals with the supervised-learning problem for both regression and classification, and includes detailed algorithms. A wide variety of covariance (kernel) functions are presented and their properties discussed. Model selection is discussed from both a Bayesian and a classical perspective.
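For readers new to the topic, the regression case the book opens with fits in a few lines: given training inputs and noisy targets, the GP posterior at test points follows directly from the kernel matrices. Here is a minimal NumPy sketch with a squared-exponential kernel; the lengthscale, signal-variance, and noise values are arbitrary illustration choices, not recommendations from the book.

```python
import numpy as np

def rbf_kernel(a, b, lengthscale=1.0, signal_var=1.0):
    """Squared-exponential covariance k(a, b) for 1-D inputs."""
    sq_dist = (a[:, None] - b[None, :]) ** 2
    return signal_var * np.exp(-0.5 * sq_dist / lengthscale**2)

def gp_posterior(x_train, y_train, x_test, noise_var=0.1):
    """Exact GP regression posterior mean and variance at x_test."""
    K = rbf_kernel(x_train, x_train) + noise_var * np.eye(len(x_train))
    K_s = rbf_kernel(x_train, x_test)        # train-test covariance
    K_ss = rbf_kernel(x_test, x_test)        # test-test covariance
    L = np.linalg.cholesky(K)                # stable solve via Cholesky
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_s.T @ alpha
    v = np.linalg.solve(L, K_s)
    var = np.diag(K_ss) - np.sum(v**2, axis=0)
    return mean, var

# Toy 1-D regression: noisy sine observations.
x = np.linspace(0, 5, 20)
y = np.sin(x) + 0.1 * np.random.default_rng(0).normal(size=x.size)
x_star = np.linspace(0, 5, 100)
mu, var = gp_posterior(x, y, x_star)
```

The posterior variance is what distinguishes GPs from point-estimate regressors: predictions far from the training data automatically come with wider uncertainty.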


Google Accelerates Quantum Computation with Classical Machine Learning

#artificialintelligence

Tech giant Google's recent claim of quantum supremacy created a buzz in the computer science community and got global mainstream media talking about quantum computing breakthroughs. Yesterday Google fed the public's growing interest in the topic with a blog post introducing a study on improving quantum computation using classical machine learning. The qubit is the most basic constituent of quantum computing, and it also poses one of the most significant challenges for the realization of near-term quantum computers, because several of its characteristics make it difficult to control. Google AI explains that issues such as imperfections in the control electronics can "impact the fidelity of the computation and thus limit the applications of near-term quantum devices."