deep learning



Top 10 Free Deep Learning Massive Open Online Courses

@machinelearnbot

To compile this list, we explored deep learning MOOCs (Massive Open Online Courses) published by top universities, colleges, and leading tech companies. Aimed at beginner, intermediate, and advanced learners, and covering most concepts of deep learning from the most basic to the cutting edge, all of these courses are free and self-paced, and some of them even offer certificates. It goes without saying that all of these courses come with some prerequisites: basic knowledge of mathematics, familiarity with GitHub repositories, and a good command of programming languages such as Python. Google has published a deep learning course via Udacity, the online course platform. Google's MOOC trains intermediate to advanced developers free of charge over 12 weeks on many aspects of deep learning, such as how to build and optimize deep neural networks.


Forget The Hype: What Every Business Leader Needs To Know About Artificial Intelligence Now

#artificialintelligence

"Just as 100 years ago electricity transformed industry after industry, AI will now do the same." Artificial Intelligence – it's on the lips of the leaders, and on the 2018 agendas of the board meetings, of almost every global company today. Directors and operating executives alike know, or think they know, that this "new electricity" is going to be the next transformative force of our world. To ignore it now could be fatal to their long-term competitive position, not to mention survival. AI-powered companies that know what they are doing -- primarily born in the Internet and mobile eras – have not only gained tremendous advantage in improved efficiency and increased profitability, they have literally changed the competitive landscape of successive industries.


TensorFlow for Deep Learning: From Linear Regression to Reinforcement Learning: Bharath Ramsundar, Reza Bosagh Zadeh: 9781491980453: Amazon.com: Books

@machinelearnbot

Reza Bosagh Zadeh is the Founder and CEO of Matroid and an Adjunct Professor at Stanford University. His work focuses on Machine Learning, Distributed Computing, and Discrete Applied Mathematics. Reza received his PhD in Computational Mathematics from Stanford University under the supervision of Gunnar Carlsson. His awards include a KDD Best Paper Award and the Gene Golub Outstanding Thesis Award. He has served on the Technical Advisory Boards of Microsoft and Databricks.


Classifying and visualizing with fastText and tSNE

#artificialintelligence

Previously I wrote a three-part series on classifying text, in which I walked through the creation of a text classifier from the bottom up. It was interesting, but it was purely an academic exercise. Here I'm going to use methods suitable for scaling up to large datasets, preferring tools written by others to those I wrote myself. The end goal is the same: classifying and visualizing relationships between blocks of text. I'm thinking of the classifier as a different representation of the block of text, so (1) and (2) are similar.
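
As a rough illustration of the kind of pipeline the post describes, here is a minimal sketch that trains a supervised fastText classifier and projects fastText sentence vectors into 2-D with t-SNE. The training file train.txt, the labels, and the example texts are hypothetical placeholders, and the fasttext, numpy, scikit-learn, and matplotlib packages are assumptions; this is not the author's actual code.

# Hedged sketch: supervised fastText classification plus a t-SNE projection
# of fastText sentence vectors. File names, labels, and texts below are
# hypothetical placeholders, not taken from the original post.
import fasttext                      # pip install fasttext
import numpy as np
import matplotlib.pyplot as plt
from sklearn.manifold import TSNE

# train.txt lines are assumed to follow fastText's supervised format,
# e.g. "__label__sports the team won the final in overtime"
model = fasttext.train_supervised(input="train.txt", epoch=25, wordNgrams=2)

texts = [
    "the team won the final in overtime",
    "the central bank raised interest rates again",
    "a new deep learning model tops the benchmark",
]

# Classify each block of text (top-1 label and its probability).
for text in texts:
    labels, probs = model.predict(text)
    print(text, "->", labels[0], round(float(probs[0]), 3))

# Represent each block of text as a fastText sentence vector and
# project the vectors to two dimensions for visualization.
vectors = np.array([model.get_sentence_vector(t) for t in texts])
coords = TSNE(n_components=2, perplexity=2, init="random",
              random_state=0).fit_transform(vectors)

plt.scatter(coords[:, 0], coords[:, 1])
for (x, y), text in zip(coords, texts):
    plt.annotate(text[:20], (x, y))
plt.title("t-SNE of fastText sentence vectors")
plt.show()

With a real corpus you would use many more documents and a larger perplexity; it is kept tiny here only so the snippet stands on its own.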


Developing the AI future

#artificialintelligence

Artificial Intelligence (AI) is starting to change how many businesses operate. The ability to accurately process and deliver data faster than any human could is already transforming how we do everything from studying diseases and understanding road traffic behaviour to managing finances and predicting weather patterns. For business leaders, AI's potential could be fundamental for future growth. With so much on offer and at stake, the question is no longer simply what AI is capable of, but where AI can best be used to deliver immediate business benefits. According to Forrester, 70% of enterprises will be implementing AI in some way over the next year.


Xavier Amatriain's Machine Learning and Artificial Intelligence Year-end Roundup

#artificialintelligence

It's hard to believe that it's only been a year since I did the previous end-of-year roundup. So much has happened in the world of AI that it is hard to fit it into a couple of paragraphs. Don't expect too many details, but do expect a lot of links to follow up on. If I have to pick my main highlight of the year, it has to go to AlphaGo Zero (paper). Not only does this new approach improve in some of the most promising directions (e.g.


The Age of AI Surveillance

#artificialintelligence

Your favorite artist is in town for what will undoubtedly be the biggest show of the year. You're at the venue's gates, ready to present your ticket for admission, when you realize you forgot it. No problem, security simply scans your face, and after the camera recognizes you, you're permitted entrance. The Chinese government is searching for a dangerous criminal among its population. A machine goes through hundreds of hours of security camera footage and identifies the right person within minutes.


Want Disruptive Change? There's An Algorithm For That (Or Soon Will Be)

#artificialintelligence

Trust me – it's not you. Our world really is more unpredictable than ever. Even the best-laid strategies are being disrupted, whether they are focused on the workplace's culture, technical environment, market dynamics, customer behavior, or business processes. But central to these uncertainties is one constant: an algorithm guiding every step along the evolutionary trail to digital transformation. "Each company has a predictable algorithm that's driving its business model," said Sathya Narasimhan, senior director for Partner Business Development at SAP, on a live episode of Coffee Break with Game Changers Radio, presented by SAP and produced and moderated by SAP's Bonnie D. Graham.


Create a Character-based Seq2Seq model using Python and Tensorflow

#artificialintelligence

In this article, I will share my findings on creating a character-based Sequence-to-Sequence (Seq2Seq) model, along with some of the results I obtained. All of this is just a small part of my Master's thesis, and it took me quite a while to learn how to turn the theoretical concepts into practical models. I will also share the lessons I have learned. This blog post is about Natural Language Processing (NLP for short). It is not easy for computers to interpret text.
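
To give a concrete sense of what a character-level Seq2Seq model looks like in TensorFlow, here is a minimal encoder-decoder sketch built with tf.keras. The vocabulary sizes, the latent dimension, and the commented-out training call are illustrative assumptions, not the author's thesis code.

# Hedged sketch of a character-level encoder-decoder (Seq2Seq) in tf.keras.
# Vocabulary sizes and dimensions below are illustrative placeholders.
import tensorflow as tf
from tensorflow.keras import layers, Model

num_encoder_chars = 70    # size of the input character vocabulary (assumed)
num_decoder_chars = 90    # size of the target character vocabulary (assumed)
latent_dim = 256          # LSTM state size

# Encoder: reads the one-hot encoded source characters and keeps only
# its final hidden and cell states as a summary of the input.
encoder_inputs = layers.Input(shape=(None, num_encoder_chars))
_, state_h, state_c = layers.LSTM(latent_dim, return_state=True)(encoder_inputs)

# Decoder: generates the target characters one step at a time, initialized
# with the encoder states (teacher forcing during training).
decoder_inputs = layers.Input(shape=(None, num_decoder_chars))
decoder_lstm = layers.LSTM(latent_dim, return_sequences=True, return_state=True)
decoder_outputs, _, _ = decoder_lstm(decoder_inputs,
                                     initial_state=[state_h, state_c])
decoder_outputs = layers.Dense(num_decoder_chars,
                               activation="softmax")(decoder_outputs)

model = Model([encoder_inputs, decoder_inputs], decoder_outputs)
model.compile(optimizer="rmsprop", loss="categorical_crossentropy")
model.summary()

# Training would look roughly like this, with one-hot encoded character tensors:
# model.fit([encoder_input_data, decoder_input_data], decoder_target_data,
#           batch_size=64, epochs=50, validation_split=0.2)

At inference time the decoder would be run one character at a time, feeding each predicted character back in as the next input until an end-of-sequence character is produced.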