Theoretical Comparisons of Positive-Unlabeled Learning against Positive-Negative Learning

Neural Information Processing Systems

In PU learning, a binary classifier is trained from positive (P) and unlabeled (U) data without negative (N) data. Although N data is missing, PU learning can sometimes outperform PN learning (i.e., ordinary supervised learning). Hitherto, neither theoretical nor experimental analysis has been given to explain this phenomenon. In this paper, we theoretically compare PU (and NU) learning against PN learning based on upper bounds on their estimation errors. We find simple conditions under which PU and NU learning are likely to outperform PN learning, and we prove that, in terms of these upper bounds, either PU or NU learning (depending on the class-prior probability and the sizes of the P and N data) given infinite U data will improve on PN learning.
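
The abstract does not reproduce the risk estimator on which such bound comparisons rest, but analyses of this kind typically build on the standard unbiased PU risk rewrite from the PU literature. As a sketch (with class prior \pi = p(y=+1), loss \ell, and classifier g; this is the standard construction, not a quotation from the paper):

    R_{\mathrm{pu}}(g) = \pi \,\mathbb{E}_{\mathrm{P}}\!\left[\ell(g(X))\right] + \mathbb{E}_{\mathrm{U}}\!\left[\ell(-g(X))\right] - \pi \,\mathbb{E}_{\mathrm{P}}\!\left[\ell(-g(X))\right]

Because p(x) = \pi\, p(x \mid y=+1) + (1-\pi)\, p(x \mid y=-1), this equals the ordinary PN risk, yet it needs only P and U expectations. Replacing the expectations with sample averages over n_P positive and n_U unlabeled examples makes the estimation error depend on \pi, n_P, and n_U rather than on n_N, which is the kind of dependence the paper's bound comparison formalizes.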


A Comprehensive Learning Path for Deep Learning in 2020

#artificialintelligence

What a time to be working in the deep learning space! Deep learning is ubiquitous right now. From the top research labs in the world to startups looking to design solutions, deep learning is at the heart of the current technological revolution. We are living in a deep learning wonderland! Whether it's Computer Vision applications or breakthroughs in the field of Natural Language Processing (NLP), organizations are looking for a piece of the deep learning pie.


Top 10 deep learning frameworks everyone should know

#artificialintelligence

This is the age of artificial intelligence. Machine learning and predictive analytics are now established and integral to just about every modern business, but artificial intelligence expands the scale of what's possible within those fields. It's what makes deep learning possible. Systems with greater autonomy and complexity can now tackle similarly complex problems.


Machine learning concepts: styles of machine learning

#artificialintelligence

This is the first in a series of posts about machine learning concepts, where we'll cover everything from learning styles to new dimensions in machine learning research. What makes machine learning so successful? The answer lies in the core concept of machine learning: a machine can learn from examples and experience. Before machine learning, machines were programmed with specific instructions and had no need to learn on their own. A machine (without machine learning) is born knowing exactly what it's supposed to do and how to do it, like a robot arm on an assembly line.
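
The post carries no code, but the contrast it draws (explicit instructions versus learning from examples) can be sketched in a few lines of Python. The spam-filter framing, the toy data, and the scikit-learn model choice below are all illustrative assumptions, not content from the post:

    # A hand-programmed "machine": its behavior is fixed by explicit rules.
    def rule_based_spam_filter(subject: str) -> bool:
        return "free" in subject.lower() or "winner" in subject.lower()

    # A learning machine: behavior is induced from labeled examples instead.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.pipeline import make_pipeline

    subjects = ["free prize winner", "meeting at noon",
                "you are a winner", "quarterly report"]
    labels = [1, 0, 1, 0]  # 1 = spam, 0 = not spam (made-up toy data)

    model = make_pipeline(CountVectorizer(), MultinomialNB())
    model.fit(subjects, labels)  # learn from examples and experience
    print(model.predict(["free winner announcement"]))  # expected: [1] (spam)

The rule-based version can only ever do what it was told; the fitted model generalizes its behavior from the examples it was shown, which is the core concept the post describes.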


[ICML2016] Ask a Workshop Anything: Deep Learning Workshop Session 2: Simulation-based Learning • /r/MachineLearning

@machinelearnbot

I'm very excited to announce that /r/MachineLearning is trying a new AMA format in collaboration with the organizers of the Deep Learning Workshop at ICML 2016. In this year's ICML Deep Learning Workshop, we depart from previous years and experiment with a completely new format: the workshop will be split into two sessions, each consisting of a set of invited talks followed by a panel discussion. By organizing the workshop in this manner, we aim to promote focused discussions that dive deep into important areas and to increase interaction between speakers and the audience. The second (afternoon) session of the workshop aims at answering the question "What does simulation-based learning bring to the table?" Under this broad theme, more specific questions may include "How transferable is the knowledge learned from a simulation to the real world?" ...