Neural Networks



AI, machine learning and deep learning: What's the difference? - IBM IT Infrastructure Blog

#artificialintelligence

It's not unusual today to see people talking about artificial intelligence (AI). When I was a kid in the 1980s, AI was depicted in Hollywood movies, but its real-world use was unimaginable given the state of technology at that time. While we don't have robots or androids that can think like a person or are likely to take over the world, AI is a reality now, and to understand what we mean when we talk about AI today we have to go through a -- quick, I promise -- introduction on some important terms. Simply put, AI is anything capable of mimicking human behavior. From the simplest application -- say, a talking doll or an automated telemarketing call -- to more robust algorithms like the deep neural networks in IBM Watson, they're all trying to mimic human behavior.


Where Common Machine Learning Myths Come From - InformationWeek

#artificialintelligence

Forrester Research recently released a report entitled Shatter the Seven Myths of Machine Learning. In it, the authors warn, "Unfortunately, there is a pandemic of ML misconceptions and literacy among business leaders who must make critical decisions about ML projects." When executives and managers talk about AI and machine learning, they sometimes make factual mistakes that reveal their true level of knowledge. Forrester senior analyst Kjell Carlsson, the lead author of the report, said in a recent interview that he has heard audible sighs over the phone when experts hear what lay people have to say. "When the head of product says something like, 'We're using reinforcement learning because we're incorporating user feedback into the trends modeling,' that's probably not a good thing," said Carlsson.


Fish Detection Using Deep Learning

#artificialintelligence

Recently, human curiosity has expanded from the land to the sky and the sea. Besides sending people to explore the ocean and outer space, robots are designed for tasks that are dangerous for living creatures. Take ocean exploration as an example: many projects and competitions on the design of Autonomous Underwater Vehicles (AUVs) have attracted wide interest. The authors of this article learned the necessity of a platform upgrade from a previous AUV design project, and would like to share their experience with one task extension in the area of fish detection. Most embedded systems have been improved by fast-growing computing and sensing technologies, which makes it possible for them to incorporate increasingly complicated algorithms. In an AUV, after acquiring surrounding information from sensors, how to perceive and analyse that information for better judgement is one of the challenges. The processing procedure can mimic human learning routines. An advanced system with more computing power can support deep learning features, which exploit neural network algorithms to simulate the human brain. In this paper, a convolutional neural network (CNN) based fish detection method is proposed.
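The CNN approach the summary describes rests on the convolution operation: a small kernel slides over the image and produces a feature map that responds to local patterns such as edges. A minimal pure-Python sketch of that operation (the image, kernel, and values are illustrative, not the authors' network):

```python
# Minimal sketch of the 2D convolution at the heart of a CNN.
# Hypothetical edge-detection example, not the paper's fish detector.

def conv2d(image, kernel):
    """Valid-mode 2D convolution (no padding, stride 1) with ReLU."""
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for r in range(ih - kh + 1):
        row = []
        for c in range(iw - kw + 1):
            acc = 0.0
            for i in range(kh):
                for j in range(kw):
                    acc += image[r + i][c + j] * kernel[i][j]
            # ReLU non-linearity, as typically applied between CNN layers
            row.append(max(0.0, acc))
        out.append(row)
    return out

# A tiny "image" with a bright vertical stripe, and a vertical-edge kernel
image = [
    [0, 0, 9, 0],
    [0, 0, 9, 0],
    [0, 0, 9, 0],
    [0, 0, 9, 0],
]
kernel = [
    [-1, 1],
    [-1, 1],
]
feature_map = conv2d(image, kernel)
```

A real detector stacks many such learned kernels across several layers, but each layer performs this same sliding dot product.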


Artificial intelligence reveals how light flows around nanoparticles – Physics World

#artificialintelligence

Artificial intelligence has been used to quickly and accurately model the 3D flow of light around arbitrarily shaped nanoparticles. Peter Wiecha and Otto Muskens at the University of Southampton in the UK demonstrated the modelling approach using a neural network that required just a single training procedure. Their technique could be used to design a wide range of optical devices that control the paths taken by light. When light interacts with nanostructures that are smaller in size than the wavelength of the light, the result can be very different from how light interacts with larger structures and continuous media. The field of nanophotonics seeks to exploit this by designing nanoparticles with particular shapes and compositions with the aim of manipulating light in specific ways.


Machine Learning Improves Satellite Rainfall Estimates - Eos

#artificialintelligence

Spaceborne precipitation observing systems can provide global coverage, but their estimates typically suffer from uncertainties and biases. Conversely, ground-based systems such as rain gauges and precipitation radar have higher accuracy but only limited spatial coverage. Chen et al. [2019] have developed a novel deep learning algorithm designed to construct a hybrid rainfall estimation system, where the ground radar is used to bridge the scale gaps between (accurate) rain gauge measurements and (less accurate) satellite observations. Such a non-parametric deep learning technique shows potential for regional and global rainfall mapping and can also be expanded into a data fusion platform through incorporation of additional precipitation estimates, such as outputs of numerical weather prediction models.


Anomaly Detection: When Old Statistics School May Still Beat Super-Duper Machine Learning

#artificialintelligence

No machine learning or god-given algorithm can do better than that. On the other hand, if you do NOT know the density of the signal, then the alternative hypothesis is unspecified. In that situation, no test statistic can ever claim to be uniformly more powerful at distinguishing the null from the alternative hypothesis, because the power of any given test statistic depends on the unknown features of the signal. In other words, it does not matter how fast you run if you don't know where you are going. So the win of a basic statistical learning tool over complex deep learning tools should not surprise you. And since the future of machine learning lies in unsupervised learning problems, as many experts in the field have been pointing out for some time, good old statistical practice remains very much in the spotlight, and will stay there for the foreseeable future.
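The argument above is the Neyman-Pearson lemma: when both the null and signal densities are fully specified, the likelihood ratio is the most powerful test statistic, and nothing learned from data can beat it. A minimal sketch with a Gaussian null and a hypothetical shifted-Gaussian signal (the means and threshold are illustrative assumptions):

```python
import math

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2)."""
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi))

def likelihood_ratio(x, mu0=0.0, mu1=3.0, sigma=1.0):
    """Neyman-Pearson statistic: signal density over null density.
    Computable only because the signal density (mu1) is specified --
    exactly the knowledge that is missing in the unspecified case."""
    return normal_pdf(x, mu1, sigma) / normal_pdf(x, mu0, sigma)

# Reject the null when the ratio exceeds a threshold. With these
# densities the ratio grows monotonically in x, so large observations
# are flagged as anomalies.
threshold = 5.0
observations = [0.2, -0.7, 3.4, 0.1]
anomalies = [x for x in observations if likelihood_ratio(x) > threshold]
```

If mu1 were unknown, the ratio could not be evaluated, and any substitute statistic would be more powerful against some signals and less powerful against others, which is the point the excerpt makes.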


MIT 6.S191: Introduction to Deep Learning

#artificialintelligence

MIT Introduction to Deep Learning 6.S191: Lecture 1 (New 2019 Edition), Foundations of Deep Learning. Lecturer: Alexander Amini, January 2019. For all lectures, slides and lab materials: http://introtodeeplearning.com


Using artificial intelligence to enrich digital maps

#artificialintelligence

A model invented by researchers at MIT and Qatar Computing Research Institute (QCRI) that uses satellite imagery to tag road features in digital maps could help improve GPS navigation. Showing drivers more details about their routes can often help them navigate in unfamiliar locations. Lane counts, for instance, can enable a GPS system to warn drivers of diverging or merging lanes. Incorporating information about parking spots can help drivers plan ahead, while mapping bicycle lanes can help cyclists negotiate busy city streets. Providing updated information on road conditions can also improve planning for disaster relief.