Predicting Portland Home Prices

#artificialintelligence

Predicting Portland home prices gave me the chance to incorporate various web scraping techniques, natural language processing on text, deep learning models on images, and gradient boosting into tackling the problem. The Zillow metadata contained the descriptors you would expect: square footage, neighborhood, year built, etc. Once I was confident that my image model was doing a good job, I was ready to combine the Zillow metadata, realtor-description word matrix, and image feature matrix into one matrix and implement gradient boosting to predict home prices. Incorporating the images into my model immediately dropped the error by $20K. Adding in the realtor descriptions dropped it by another $10K. Finally, adding in the Zillow metadata lowered the mean absolute error to approximately $71K. Perhaps you are wondering how well the Zillow metadata alone would do in predicting home prices?
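
The post doesn't include code, but the combining step is easy to picture. Here is a minimal sketch of stacking the three feature blocks column-wise and fitting a gradient boosting regressor with scikit-learn; all shapes, feature counts, and the data itself are illustrative stand-ins, not the author's actual pipeline.

```python
import numpy as np
from scipy.sparse import hstack, csr_matrix
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

# Hypothetical feature blocks: Zillow metadata, realtor-description
# word matrix, and CNN image features (shapes are illustrative).
n_homes = 1000
zillow_meta = np.random.rand(n_homes, 10)                     # sqft, year built, etc.
description_words = csr_matrix(np.random.rand(n_homes, 500))  # bag-of-words matrix
image_features = np.random.rand(n_homes, 256)                 # CNN embeddings
prices = np.random.rand(n_homes) * 500_000                    # sale prices

# Combine all three blocks column-wise into one design matrix.
X = hstack([csr_matrix(zillow_meta), description_words,
            csr_matrix(image_features)]).toarray()

X_train, X_test, y_train, y_test = train_test_split(
    X, prices, test_size=0.2, random_state=42)

model = GradientBoostingRegressor(n_estimators=300, max_depth=3,
                                  learning_rate=0.05)
model.fit(X_train, y_train)

print("MAE: $%.0f" % mean_absolute_error(y_test, model.predict(X_test)))
```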


How Facebook Is Using Artificial Intelligence

@machinelearnbot

With a vision that "artificial intelligence can play a big role in helping bring the world closer together," Facebook has opened a new AI research lab in Montreal as part of Facebook AI Research (FAIR). In addition, Facebook announced $7 million in AI support over five years for the Canadian Institute for Advanced Research (CIFAR), the Montreal Institute for Learning Algorithms (MILA), McGill University, and Université de Montréal. The company also allocated $6 million to the Université de Montréal and $1 million to McGill University for AI research, again over five years. Last year, Facebook introduced DeepText, a deep-learning-based text-understanding engine.


Practical Deep Learning with PyTorch - Udemy

@machinelearnbot

While many courses are either heavily mathematical or purely practical, this course strikes a careful balance between the two, providing a solid foundation in deep learning for you to explore further, whether you are interested in research or in applied deep learning. It is purposefully made for anyone without a strong background in mathematics. For those with a strong background, it will accelerate your understanding of the different deep learning models. This is not a course that places heavy emphasis on the mathematics behind deep learning.


Deep Learning Prerequisites: Linear Regression in Python

@machinelearnbot

This course teaches you one popular technique used in machine learning, data science, and statistics: linear regression. Linear regression is the simplest machine learning model you can learn, yet there is so much depth to it that you'll be returning to it for years to come. We will apply multi-dimensional linear regression to predict a patient's systolic blood pressure given their age and weight. If you want more than just a superficial look at machine learning models, this course is for you.
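
As a taste of what the course covers, here is a minimal sketch of multi-dimensional linear regression via the normal equations; the blood-pressure data below is synthetic, with made-up coefficients, not the course's dataset.

```python
import numpy as np

# Synthetic data: predict systolic blood pressure from age and weight.
rng = np.random.default_rng(0)
N = 200
age = rng.uniform(20, 80, N)
weight = rng.uniform(50, 120, N)
# Hypothetical generating process with noise (coefficients illustrative).
systolic = 100 + 0.5 * age + 0.2 * weight + rng.normal(0, 5, N)

# Design matrix with a bias column of ones.
X = np.column_stack([np.ones(N), age, weight])

# Solve the normal equations: w = (X^T X)^{-1} X^T y.
w = np.linalg.solve(X.T @ X, X.T @ systolic)
print("intercept, age coef, weight coef:", w)

# R^2 on the training data.
pred = X @ w
ss_res = np.sum((systolic - pred) ** 2)
ss_tot = np.sum((systolic - systolic.mean()) ** 2)
print("R^2:", 1 - ss_res / ss_tot)
```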


Harri Valpola dreams of an internet of beautiful AI minds

#artificialintelligence

Valpola, 44, is founder of The Curious AI Company, a 20-person artificial intelligence startup based in Helsinki, which has just raised $3.67 million in funding – small change compared to many tech funding rounds, but an impressive sum for a company that has no products and is only interested in research. Wanting to put his theories into practice, Valpola co-founded ZenRobotics, a startup building brains for intelligent robots. At this year's Conference on Neural Information Processing Systems (the leading conference in AI, better known as NIPS), he is going to present a cousin of the ladder network, punningly entitled Mean Teacher. "I've met Harri a few times, and we have similar views on AI and deep learning," says Murray Shanahan, professor of cognitive robotics at Imperial College London.
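
The article doesn't explain how Mean Teacher works, but the core idea of the published method is simple: the teacher's weights are an exponential moving average (EMA) of the student's weights, and a consistency loss pushes the student toward the teacher's predictions on perturbed inputs. A toy numpy sketch of the EMA update (all names and numbers illustrative):

```python
import numpy as np

def ema_update(teacher_weights, student_weights, alpha=0.99):
    """Mean Teacher core idea: the teacher's weights track an exponential
    moving average of the student's weights after each training step."""
    return [alpha * t + (1 - alpha) * s
            for t, s in zip(teacher_weights, student_weights)]

# Toy illustration: one weight matrix per "layer".
student = [np.random.randn(4, 4)]
teacher = [w.copy() for w in student]

for step in range(100):
    # Stand-in for a gradient step on the student.
    student = [w - 0.01 * np.random.randn(*w.shape) for w in student]
    teacher = ema_update(teacher, student)

# During training, a consistency loss pushes the student's predictions
# toward the teacher's predictions on differently perturbed inputs.
```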


Machine Learning at HPC User Forum: Drilling into Specific Use Cases

@machinelearnbot

Dr. Weng-Keen Wong of the NSF echoed much the same distinction between specific- and general-case algorithms in his talk "Research in Deep Learning: A Perspective From NSF," a distinction also raised by Nvidia's Dale Southard during the disruptive-technology panel. Tim Barr's (Cray) "Perspectives on HPC-Enabled AI" showed how Cray's HPC technologies can be leveraged for machine and deep learning in vision, speech, and language. Fresh off its integration of SGI technology into its stack, HPE gave a talk that not only highlighted the newer software platforms that learning systems leverage but also demonstrated that HPE's portfolio of systems and experience in both HPC and hyperscale environments is impressive indeed. Stand-alone image recognition is really cool, but as expounded above, the true benefit of deep learning is an integrated workflow in which data sources are ingested by a general-purpose deep learning platform, with outcomes that benefit business, industry, and academia.


[P] A few questions on building a linear neural network from scratch. • r/MachineLearning

#artificialintelligence

I am working on developing a neural network from scratch. As of right now, all of the framework is complete, and I have a simple game that I am testing the neural network on. I believe my current problems lie in backpropagation. I was wondering whether there is more that needs to be done for backpropagation and whether my techniques seem sound.
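
For reference, here is what a correct backward pass looks like for a one-hidden-layer network; a minimal numpy sketch, not the poster's code, with an illustrative regression setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 4 samples, 3 features, 1 target (all illustrative).
X = rng.standard_normal((4, 3))
y = rng.standard_normal((4, 1))

# One hidden layer with a sigmoid activation, linear output.
W1 = rng.standard_normal((3, 5)) * 0.1
b1 = np.zeros((1, 5))
W2 = rng.standard_normal((5, 1)) * 0.1
b2 = np.zeros((1, 1))

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for step in range(1000):
    # Forward pass.
    z1 = X @ W1 + b1
    a1 = sigmoid(z1)
    out = a1 @ W2 + b2
    loss = np.mean((out - y) ** 2)

    # Backward pass: chain rule from the loss back to each parameter.
    d_out = 2 * (out - y) / len(X)        # dL/d_out for MSE
    dW2 = a1.T @ d_out
    db2 = d_out.sum(axis=0, keepdims=True)
    d_a1 = d_out @ W2.T
    d_z1 = d_a1 * a1 * (1 - a1)           # sigmoid derivative
    dW1 = X.T @ d_z1
    db1 = d_z1.sum(axis=0, keepdims=True)

    # Gradient descent update (in place).
    for p, g in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
        p -= 0.1 * g
```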


[R] Cyclical Learning Rates for Training Neural Networks • r/MachineLearning

@machinelearnbot

Submission statement: Finding the correct learning rate is a pain, but this paper shows how to find reasonable learning rate bounds. You can then cyclically vary your learning rate within those bounds to get better accuracy and, often, shorter training time. P.S. There is a PR for this in keras-contrib.
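
The paper's triangular policy takes only a few lines to implement. A sketch of the schedule, with illustrative hyperparameters (in practice the bounds should come from the learning-rate range test the paper describes):

```python
import numpy as np

def triangular_clr(iteration, base_lr=1e-4, max_lr=1e-2, step_size=2000):
    """Triangular cyclical learning rate: the LR ramps linearly from
    base_lr up to max_lr and back down over 2 * step_size iterations."""
    cycle = np.floor(1 + iteration / (2 * step_size))
    x = np.abs(iteration / step_size - 2 * cycle + 1)
    return base_lr + (max_lr - base_lr) * max(0.0, 1.0 - x)

# The bounds come from a range test: train briefly while increasing the
# LR and pick the window where the loss is still decreasing.
for it in (0, 1000, 2000, 3000, 4000):
    print(it, triangular_clr(it))
```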


Deep Learning Research Review: Natural Language Processing

@machinelearnbot

The traditional approach to NLP involved a lot of domain knowledge from linguistics itself. Understanding terms such as phonemes and morphemes was pretty standard, as there are whole linguistics classes dedicated to their study. Let's look at how traditional NLP would try to understand the following word.
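
As a toy illustration of the morpheme-level analysis this traditional approach relies on, here is a rule-based segmenter; the affix lists and the example word are mine, not the article's:

```python
# Toy morpheme segmentation in the traditional, rule-based spirit:
# strip known affixes to recover a stem (rules are illustrative only).
PREFIXES = ("un", "re", "dis")
SUFFIXES = ("ness", "ing", "ed", "s")

def segment(word):
    morphemes = []
    for p in PREFIXES:
        if word.startswith(p):
            morphemes.append(p + "-")
            word = word[len(p):]
            break
    suffix = None
    for s in SUFFIXES:
        if word.endswith(s):
            suffix = "-" + s
            word = word[:-len(s)]
            break
    morphemes.append(word)
    if suffix:
        morphemes.append(suffix)
    return morphemes

print(segment("uninterested"))  # ['un-', 'interest', '-ed']
```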


Deep learning must happen at the edge, too - SiliconANGLE

#artificialintelligence

We've written about a number of them at Wikibon: machine learning systems that extend the useful life of ERP systems in the grocery business; digital-twin software that can dramatically improve automation in complex operations; and rapidly evolving technologies for accelerating productivity in information technology operations management (ITOM), without which advances in other digital business domains would be impossible. That got the Wikibon research team thinking: where will deep learning processing take place? Moreover, the rapid advances in hardware that are powering the development of the cloud are also reshaping computing possibilities at the edge, in local machines and human-friendly mobile devices. Action item: business leaders must explore the new generation of artificial intelligence technologies, which will have profound product, operations, and customer-experience implications in all industries.