Deep Learning


Predicting Portland Home Prices

#artificialintelligence

Predicting Portland home prices let me bring several techniques to bear on one problem: web scraping, natural language processing on realtor descriptions, deep learning on listing images, and gradient boosting. The Zillow metadata contained the descriptors you would expect: square footage, neighborhood, year built, and so on. Once I was confident that my image model was doing a good job, I combined the Zillow metadata, the realtor-description word matrix, and the image feature matrix into a single matrix and applied gradient boosting to predict home prices. Incorporating the images immediately dropped the model's error by $20K. Adding the realtor descriptions dropped it by another $10K. Finally, adding the Zillow metadata lowered the mean absolute error to approximately $71K. Perhaps you are wondering how well the Zillow metadata alone would do in predicting home prices?
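A minimal sketch of that final step, assuming the three feature blocks have already been built; the file names and hyperparameters here are hypothetical stand-ins, not the author's actual pipeline:

```python
import numpy as np
from scipy.sparse import hstack, csr_matrix
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

# Hypothetical inputs: Zillow metadata (n_homes x n_meta columns),
# a bag-of-words matrix from realtor descriptions, and CNN-derived
# image features, plus the sale prices as targets.
X_meta = np.load("zillow_metadata.npy")   # e.g. sqft, year built, beds
X_text = np.load("description_bow.npy")
X_img = np.load("image_features.npy")
y = np.load("sale_prices.npy")

# Stack the three feature blocks side by side into one design matrix.
X = hstack([csr_matrix(X_meta), csr_matrix(X_text),
            csr_matrix(X_img)]).toarray()

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

# Gradient boosting on the combined features.
model = GradientBoostingRegressor(n_estimators=500, max_depth=3,
                                  learning_rate=0.05)
model.fit(X_train, y_train)

print("MAE: $%.0f" % mean_absolute_error(y_test, model.predict(X_test)))
```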


How Facebook Is Using Artificial Intelligence

@machinelearnbot

With a vision that "artificial intelligence can play a big role in helping bring the world closer together," Facebook has opened a new AI research lab in Montreal as part of Facebook AI Research (FAIR). In addition, Facebook announced $7 million in AI support over five years for the Canadian Institute for Advanced Research (CIFAR), the Montreal Institute for Learning Algorithms (MILA), McGill University, and Université de Montréal. Of that total, $6 million goes to the Université de Montréal and $1 million to McGill University for AI research over five years. Last year, Facebook introduced DeepText, a deep-learning-based text-understanding engine.


Machine Learning And The Future Of Media - Disruption Hub

#artificialintelligence

Machine learning has disrupted traditional media by enabling personalisation. However, as much as data analysis and deep learning have disrupted traditional media distribution, machine learning still relies on human judgement to work effectively. Despite fears that machine learning could threaten editorial jobs, the online social platform Whisper delivers content developed using deep learning and data-mining techniques to over 30 million monthly users. Even when controlled by human editorial teams, data analysis and deep learning have disrupted traditional distribution by essentially killing off serendipitous discovery.


Practical Deep Learning with PyTorch - Udemy

@machinelearnbot

Although many courses are either heavily mathematical or narrowly practical, this course strikes a careful balance between the two, providing a solid foundation in deep learning that you can build on whether your interest lies in research or in applied work. It is purposefully made for anyone without a strong background in mathematics; for those with a strong background, it will accelerate your understanding of the different models in deep learning. This is not a course that emphasizes the mathematics behind deep learning.


Deep Learning Prerequisites: Linear Regression in Python

@machinelearnbot

This course teaches you one popular technique used in machine learning, data science, and statistics: linear regression. Linear regression is the simplest machine learning model you can learn, yet there is so much depth that you'll be returning to it for years to come. We will apply multi-dimensional linear regression to predict a patient's systolic blood pressure given their age and weight. If you want more than just a superficial look at machine learning models, this course is for you.
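A minimal sketch of that blood-pressure example, fitting multiple linear regression with the normal equations; the data below is invented purely for illustration:

```python
import numpy as np

# Hypothetical data: columns are age (years) and weight (kg);
# targets are systolic blood pressure readings (mmHg).
X = np.array([[35, 70], [47, 82], [52, 90],
              [61, 76], [68, 95], [29, 64]], dtype=float)
y = np.array([118, 128, 135, 132, 141, 112], dtype=float)

# Prepend a column of ones so the model learns an intercept.
Xb = np.column_stack([np.ones(len(X)), X])

# Normal equations: w = (X^T X)^{-1} X^T y
w = np.linalg.solve(Xb.T @ Xb, Xb.T @ y)
print("intercept, age coef, weight coef:", w)

# Predict systolic BP for a 50-year-old weighing 80 kg.
print("prediction:", np.array([1.0, 50.0, 80.0]) @ w)
```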


Machine Learning at HPC User Forum: Drilling into Specific Use Cases

@machinelearnbot

Dr. Weng-Keen Wong of the NSF echoed much the same distinction between specific-case and general-case algorithms in his talk "Research in Deep Learning: A Perspective From NSF," a distinction Nvidia's Dale Southard also raised during the disruptive-technology panel. Tim Barr of Cray, in "Perspectives on HPC-Enabled AI," showed how Cray's HPC technologies can be leveraged for machine and deep learning in vision, speech, and language. Fresh off its integration of SGI technology into its stack, HPE gave a talk that not only highlighted the newer software platforms its learning systems leverage, but also demonstrated that HPE's portfolio of systems and experience in both HPC and hyperscale environments is impressive indeed. Stand-alone image recognition is really cool, but as expounded above, the true benefit of deep learning is an integrated workflow in which data sources are ingested by a general-purpose deep learning platform, with outcomes that benefit business, industry, and academia.


How AI apps for banks are changing the face of the financial sector

#artificialintelligence

These AI-based applications can integrate with a user's online bank accounts, debit and credit cards, and e-wallets to track their expenses, offer advice on better expense management, and help them choose financial products that fit their financial habits, liquidity requirements, and short-term savings goals. This not only helps end users quickly get vital input on suitable financial products, but also helps banks market and sell the most appropriate products to users. With all this information and highly sophisticated algorithms, these AI models can make investment decisions very quickly. Very soon, financial services firms will recognize the dire need to adopt AI applications to deliver sophisticated, personalized, and highly secure services to clients.


Deep Learning Research Review: Natural Language Processing

@machinelearnbot

The traditional approach to NLP involved a lot of domain knowledge of linguistics itself. Understanding terms such as phonemes and morphemes was pretty standard, as there are whole linguistics classes dedicated to their study. Let's look at how traditional NLP would try to understand the following word.
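As a toy illustration of that morpheme-driven approach (the word and the affix lists below are my own stand-ins, since the excerpt cuts off before the article's actual example):

```python
# Toy morphological analysis in the traditional-NLP spirit:
# peel known prefixes and suffixes off a word to expose its stem.
PREFIXES = ["un", "re", "dis"]
SUFFIXES = ["ed", "ing", "s", "est"]

def morphemes(word):
    parts = []
    # Strip at most one known prefix, keeping a plausible stem length.
    for p in PREFIXES:
        if word.startswith(p) and len(word) > len(p) + 2:
            parts.append(p)
            word = word[len(p):]
            break
    # Strip at most one known suffix.
    suffix = None
    for s in SUFFIXES:
        if word.endswith(s) and len(word) > len(s) + 2:
            suffix = s
            word = word[:-len(s)]
            break
    parts.append(word)
    if suffix:
        parts.append(suffix)
    return parts

print(morphemes("uninterested"))  # ['un', 'interest', 'ed']
```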


The latest iPhones show why A.I. is the new electricity

#artificialintelligence

The "Neural Engine" is designed to work with Apple's Core ML developer tools, which exist for app developers to gain easy access to the power of machine learning. If you were to list the emerging technologies that will define the coming few decades, these might include augmented and virtual reality, self-driving cars, gene therapy and many others.


Deep learning must happen at the edge, too - SiliconANGLE

#artificialintelligence

We've written about a number of them at Wikibon: machine learning systems that extend the useful life of ERP systems in the grocery business; digital-twin software that can dramatically improve automation in complex operations; and rapidly evolving technologies for accelerating productivity in information technology operations management, or ITOM, without which advances in other digital business domains would be impossible. That got the Wikibon research team thinking: where will deep learning processing take place? The rapid advances in hardware technologies powering the development of the cloud are also reshaping computing possibilities at the edge, in local machines and human-friendly mobile devices. Action item: Business leaders must explore the new generation of artificial intelligence technologies, which will have profound product, operations, and customer-experience implications in all industries.