How Neural Networks Work - Simply Explained

#artificialintelligence

Have you ever wondered how large machine learning libraries such as Keras and TensorFlow use neural networks? Discover an algorithmic approach to artificial intelligence in this video by learning how to build a neural network from scratch.
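
The video itself isn't reproduced here, but the core of a from-scratch network fits in a few lines of plain NumPy. The sketch below trains a tiny two-layer network on XOR with hand-written backpropagation; the layer sizes, learning rate, and task are illustrative assumptions, not details taken from the video.

    # A minimal two-layer network trained on XOR with plain NumPy.
    # Layer sizes, learning rate, and iteration count are arbitrary choices.
    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    rng = np.random.default_rng(0)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    W1, b1 = rng.normal(size=(2, 4)), np.zeros((1, 4))  # input -> hidden
    W2, b2 = rng.normal(size=(4, 1)), np.zeros((1, 1))  # hidden -> output

    for step in range(10000):
        # Forward pass.
        h = sigmoid(X @ W1 + b1)
        out = sigmoid(h @ W2 + b2)
        # Backward pass: chain rule applied layer by layer.
        d_out = (out - y) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        # Gradient-descent updates.
        W2 -= 0.5 * h.T @ d_out
        b2 -= 0.5 * d_out.sum(axis=0, keepdims=True)
        W1 -= 0.5 * X.T @ d_h
        b1 -= 0.5 * d_h.sum(axis=0, keepdims=True)

    print(out.round(2))  # approaches [[0], [1], [1], [0]]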


Best Resources for Getting Started With Generative Adversarial Networks (GANs)

#artificialintelligence

Generative Adversarial Networks, or GANs, are a deep learning technique for generative modeling in which two networks, a generator and a discriminator, are trained in competition with each other. GANs are the technique behind the startlingly photorealistic generation of human faces, as well as impressive image translation tasks such as photo colorization, face de-aging, super-resolution, and more. Getting started with GANs can be very challenging, both because the field is very young, beginning with the first paper in 2014, and because of the vast number of papers and applications published on the topic every month. In this post, you will discover the best resources that you can use to learn about generative adversarial networks.
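
As a rough illustration of the adversarial setup those resources explain, the sketch below pits a small generator against a small discriminator in PyTorch, using a toy 1-D Gaussian as the "real" data. The architecture and hyperparameters are assumptions chosen for brevity, not a recipe from any particular paper.

    # A minimal GAN sketch in PyTorch: a generator learns to mimic
    # samples from a 1-D Gaussian. All sizes and rates are illustrative.
    import torch
    import torch.nn as nn

    G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))  # noise -> fake sample
    D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1))  # sample -> real/fake logit

    opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
    opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
    bce = nn.BCEWithLogitsLoss()

    for step in range(2000):
        real = torch.randn(64, 1) * 1.5 + 4.0  # "real" data drawn from N(4, 1.5)
        fake = G(torch.randn(64, 8))

        # Discriminator step: label real samples 1, generated samples 0.
        d_loss = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
        opt_d.zero_grad()
        d_loss.backward()
        opt_d.step()

        # Generator step: try to make D label the fakes as real.
        g_loss = bce(D(fake), torch.ones(64, 1))
        opt_g.zero_grad()
        g_loss.backward()
        opt_g.step()

    print(G(torch.randn(1000, 8)).mean().item())  # should drift toward 4.0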


Yann LeCun, "Predictive Learning: The Next Frontier in AI" - Bell Labs

#artificialintelligence

Yann LeCun is Director of Artificial Intelligence Research at Facebook and Silver Professor of Data Science, Computer Science, Neural Science, and Electrical Engineering at New York University. He began his career at Bell Labs in 1988, where he did foundational work on neural networks. His handwriting recognition tools are used for automated check processing across much of the financial industry. His later work on convolutional neural networks has revolutionized the fields of image analysis, speech recognition, and language translation. LeCun is now taking the next steps in AI: giving machines "common sense" so that they can use predictive models to train themselves.


Call for Contributions - Design Automation Conference

#artificialintelligence

The machine learning and artificial intelligence (ML/AI) topic highlights advances in the field, with a focus on design automation at the intersection of ML/AI algorithms and hardware (AI/ML System Design, Approximate Computing for AI/ML).

While artificial intelligence and artificial neural network research has been ongoing for more than half a century, recent advances in accelerating the pace and scale of machine learning, enabled by tensor-flow-based gradient optimization in deeply layered convolutional networks (convnets), are revolutionizing the impact of artificial intelligence on every aspect of our daily lives, ranging from smart consumer electronics and services to self-navigating cars and personalized medicine.

These advances in deep learning are fueled by computing architectures tailored to the distributed nature of learning and inference in neural networks, akin to the distributed nature of neural information processing and synaptic plasticity in the biological brain. Neuromorphic, brain-inspired electronics for ML/AI aim to port the brain's efficacy, efficiency, and resilience to noise and variability to electronic equivalents in standard CMOS and emerging technologies, offering new design challenges and opportunities to advance computing architecture beyond Moore's law scaling limits.

The ML/AI sessions at DAC will highlight the fundamentals, accomplishments to date, and challenges ahead in ML/AI hardware system design and design automation, providing a forum for researchers and practitioners across all the widely varying disciplines involved to connect, engage, and join in shaping the future of this exciting field.


A Primer on Deep Learning - DataRobot

#artificialintelligence

Deep learning has been all over the news lately. In a presentation I gave at the Boston Data Festival 2013 and at a recent PyData Boston meetup, I provided some history of the method and a sense of what it is being used for at present. This post covers the first half of that presentation, focusing on the question of why we have been hearing so much about deep learning lately. The content is aimed at data scientists who might have heard a little about deep learning and are interested in a bit more context. Regardless of your background, hopefully you will see how deep learning might be relevant to you.