

Multi-Layer Neural Networks with Sigmoid Function-- Deep Learning for Rookies (2)

#artificialintelligence

Back in the 1950s and 1960s, people had no effective learning algorithm for a single-layer perceptron to learn and identify non-linear patterns (remember the XOR gate problem?). Data in the input layer is labeled as x with subscripts 1, 2, 3, …, m. Neurons in the hidden layer are labeled as h with subscripts 1, 2, 3, …, n. Note that the hidden layer uses n rather than m, since the number of hidden-layer neurons may differ from the number of inputs. Each hidden layer also carries a superscript, so that when you have several hidden layers you can tell which is which: the first hidden layer has superscript 1, the second has superscript 2, and so on, like in Graph 3. Now that seems like dating material for our neural network :) The sigmoid function, unlike the step function, introduces non-linearity into our neural network model.
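That non-linearity matters for exactly the XOR problem mentioned above. As a minimal sketch (with hand-picked weights rather than learned ones, purely for illustration), a two-layer network of sigmoid units can compute XOR, which no single-layer perceptron can:

```python
import math

def sigmoid(z):
    # Smooth, non-linear squashing function mapping any real z into (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

def xor_net(x1, x2):
    # Hidden layer: h1 fires when at least one input is on (OR-like),
    # h2 fires only when both inputs are on (AND-like).
    h1 = sigmoid(20 * x1 + 20 * x2 - 10)
    h2 = sigmoid(20 * x1 + 20 * x2 - 30)
    # Output: "OR and not AND", which is exactly XOR
    return sigmoid(20 * h1 - 20 * h2 - 10)

for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(a, b, round(xor_net(a, b)))  # 0, 1, 1, 0
```

In practice the weights would be found by training, but the fixed weights above make it easy to see where the extra expressive power of a hidden layer comes from.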


Is "Artificial Intelligence" Dead? Long Live Deep Learning?!?

#artificialintelligence

Has Deep Learning become synonymous with Artificial Intelligence? Read a discussion on the topic fuelled by the opinions of 7 participating experts, and gain some additional insight into the future of research and technology. Deep learning has achieved some very impressive accomplishments of late. Given these high-profile successes, one could forgive the uninitiated (be they laymen or tech-savvy individuals) for the casual confounding of terms such as "artificial intelligence" and "deep learning," among others.


Mapping the Canadian AI Ecosystem

#artificialintelligence

UPDATE June 13, 2017: Last week I posted V1 of our Map of the Canadian AI Ecosystem, and since then I've been inundated with additions. Edmonton received $35M from Ottawa while Vancouver received none of that federal money, despite having 5x the number of startups that Edmonton has. Edmonton has a great opportunity to build its startup ecosystem before the venture funding really kicks in. But if we want to go beyond research and become big players in the AI market, research alone is not enough.



Facebook's AI accidentally created its own language • r/artificial

#artificialintelligence

Submissions should generally be about Artificial Intelligence and its applications. Try to avoid posting submissions that seem like a self-advertisement. The topic of Artificial Intelligence is very broad and there are many good learning resources available on the internet and in print. However, to get started with Artificial Intelligence it's enough to understand the following two books:


Using ANNs on small data – Deep Learning vs. Xgboost

@machinelearnbot

Import some Keras goodness (and perhaps run pip install keras first if you need it). To keep the checkpoint from just before overfitting sets in, ModelCheckpoint lets us save the best model before validation-set performance starts to decline. We will then do the same with good old xgboost (conda install xgboost) through its nice sklearn API. On these datasets, training the ANN takes no time at all.
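What ModelCheckpoint does is simple enough to sketch in plain Python (a toy version, assuming a training loop that yields one validation loss per epoch; the real Keras callback writes weights to disk rather than keeping them in memory):

```python
import copy

def train_with_checkpoint(epochs, fit_one_epoch, get_params, val_loss):
    """Keep a snapshot of the best parameters seen so far, so we can roll
    back to the model from just before validation loss starts rising."""
    best_loss, best_params = float("inf"), None
    for _ in range(epochs):
        fit_one_epoch()
        loss = val_loss()
        if loss < best_loss:                           # improved on validation set
            best_loss = loss
            best_params = copy.deepcopy(get_params())  # like save_best_only=True
    return best_params, best_loss

# Toy "model": validation loss falls, then rises again (overfitting kicks in).
losses = [0.9, 0.5, 0.3, 0.4, 0.6]
state = {"epoch": -1}
def step(): state["epoch"] += 1

params, loss = train_with_checkpoint(
    epochs=5,
    fit_one_epoch=step,
    get_params=lambda: dict(state),
    val_loss=lambda: losses[state["epoch"]],
)
print(params, loss)  # the kept snapshot is from epoch 2, where loss bottomed out at 0.3
```

The same keep-the-best pattern is why the post can train past the overfitting point and still recover the best model afterwards.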


5 ways to get the latest AI news

#artificialintelligence

Putting the ART and the INTEL in Artificial Intelligence, The Visionary newsletter skips the overly insider and straightforward takes of many a tech site to focus on a curated mix of the novel and the newsworthy, with a particular emphasis on the computer vision side of AI -- you know, the cool stuff like self-driving cars, robots, and augmented reality -- and a small dose of pop culture. Who it's for: Anyone scared off by all the ones and zeroes and math that inevitably come into any discussions of AI, as well as those interested in the more visual aspects of AI (computer vision, augmented reality) and how they connect to our daily life. Formerly known as Technically Sentient, a cool name we hope sees the light of day in another form down the road, this weekly newsletter is published by Inside, a newsletter publisher known for its eclectic range of deep dives (e.g., Inside Automotive, Inside Streaming). Curated by computer science engineer Denny Britz, WildML, also known as The Wild Week in AI, kicks off with a "TLDR" (too long didn't read) summary of its contents.


TensorFlow Basics -- TensorFlow for Hackers (Part I)

@machinelearnbot

It's used mainly for machine learning (especially deep learning) tasks. Let's check that we can import TensorFlow. For our example, let's find out how eating burgers affects your resting heart rate. We'll use our optimizer for 300 steps of learning, get the final and best predictions for a and b, and then compare the predicted and actual values for y. Those two lines overlap pretty well -- what did you expect?
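The burgers-vs-heart-rate fit boils down to minimizing squared error over a line y = a*x + b. Here is the same idea sketched in plain Python with hand-rolled gradient descent (the data below is invented for illustration; the post itself uses TensorFlow's optimizer on its own numbers):

```python
# Toy data (made up): burgers per week vs. resting heart rate
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [60.0 + 3.0 * x for x in xs]   # true line: a = 3, b = 60

a, b = 0.0, 0.0
lr, n = 0.1, len(xs)
for _ in range(300):                # 300 steps of learning, as in the post
    # Gradients of the mean squared error (1/n) * sum((a*x + b - y)^2)
    grad_a = sum(2 * x * (a * x + b - y) for x, y in zip(xs, ys)) / n
    grad_b = sum(2 * (a * x + b - y) for x, y in zip(xs, ys)) / n
    a -= lr * grad_a
    b -= lr * grad_b

print(round(a, 3), round(b, 3))     # converges close to the true a = 3, b = 60
```

TensorFlow automates exactly the gradient computation done by hand here, which is why the framework version is only a few lines longer while scaling to far bigger models.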


Control of Memory, Active Perception, and Action in Minecraft - Junhyuk Oh • r/artificial

#artificialintelligence



You can probably use deep learning even if your data isn't that big

@machinelearnbot

Deep learning models are complex and tricky to train, and I had a hunch that lack of model convergence/difficulty training probably explained the poor performance, not overfitting. We recreated Python versions of the Leekasso and the MLP used in the original post to the best of our ability, and the code is available here. The MLP used in the original analysis still looks pretty bad for small sample sizes, but our neural nets get essentially perfect accuracy for all sample sizes. A lot of the parameters are problem-specific (especially those related to SGD), and poor choices will result in misleadingly bad performance.
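The "Leekasso" mentioned above is essentially "pick the 10 most predictive features and fit a simple model." A rough stand-in in plain Python (synthetic data, with a nearest-centroid classifier swapped in for the original's regression model, purely to illustrate why such a baseline is hard to beat on tiny samples):

```python
import random

random.seed(0)
n_train, n_test, n_feat, n_info = 40, 20, 100, 5

def sample(label):
    # Features 0..4 are shifted by +2 for class 1; the other 95 are pure noise.
    return ([random.gauss(2.0 if label and j < n_info else 0.0, 1.0)
             for j in range(n_feat)], label)

train = [sample(i % 2) for i in range(n_train)]
test = [sample(i % 2) for i in range(n_test)]

def mean(vals):
    return sum(vals) / len(vals)

# Leekasso-style selection: keep the 10 features whose class means differ most.
diffs = []
for j in range(n_feat):
    m0 = mean([x[j] for x, y in train if y == 0])
    m1 = mean([x[j] for x, y in train if y == 1])
    diffs.append((abs(m1 - m0), j))
top10 = [j for _, j in sorted(diffs, reverse=True)[:10]]

# Simple classifier on the selected features: nearest class centroid.
cent = {c: [mean([x[j] for x, y in train if y == c]) for j in top10]
        for c in (0, 1)}

def predict(x):
    dist = {c: sum((x[j] - m) ** 2 for j, m in zip(top10, cent[c]))
            for c in (0, 1)}
    return min(dist, key=dist.get)

acc = mean([predict(x) == y for x, y in test])
print(acc)  # high accuracy from a 10-feature model on only 40 training samples
```

With so few samples, the ten selected features carry almost all the usable signal, so the simple model has little room left to lose to a deep net -- unless the net's hyperparameters are tuned carefully, which is the post's point.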