Deep Learning


Artificial Intelligence to Sort Through ISR Data Glut

#artificialintelligence

Inundated with more data than humans can analyze, the U.S. military and intelligence community are banking on machine learning and advanced computing technologies to separate the wheat from the chaff. The Defense Department operates more than 11,000 drones that collect hundreds of thousands of hours of video footage every year. "When it comes to intelligence, surveillance and reconnaissance, or ISR, we have more platforms and sensors than at any time in Department of Defense history," said Air Force Lt. Gen. John N.T. "Jack" Shanahan, director for defense intelligence (warfighter support) in the office of the undersecretary of defense for intelligence. "It's an avalanche of data that we are not capable of fully exploiting," he said at a technology conference in Washington, D.C., hosted by Nvidia, a Santa Clara, California-based artificial intelligence computing company. For example, the Pentagon has deployed a wide-area motion imagery sensor that can look at an entire city.
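
To make the scale of that "avalanche" concrete, here is a back-of-envelope sketch in Python; the footage total and bitrate are illustrative assumptions of mine, not figures from the article:

    # Rough estimate of annual drone-video storage (illustrative numbers only).
    HOURS_PER_YEAR = 300_000        # assumed: "hundreds of thousands of hours"
    BITRATE_MBPS = 10               # assumed bitrate for HD full-motion video

    seconds = HOURS_PER_YEAR * 3600
    total_bits = seconds * BITRATE_MBPS * 1_000_000
    petabytes = total_bits / 8 / 1e15
    print(f"~{petabytes:.1f} PB per year")   # ~1.4 PB at these assumptions

Even at these conservative assumptions, the volume is far beyond what human analysts could watch, which is the gap machine learning is meant to close.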


High Bandwidth Memory: The Great Awakening of AI

#artificialintelligence

Artificial intelligence (AI) is fast becoming one of the most important areas of digital expansion in history. The CEO of Applied Materials recently stated that "the war" for AI leadership will be the "biggest battle of our lifetime." AI promises to transform almost every industry, including healthcare (diagnosis, treatments), automotive (autonomous driving), manufacturing (robot assembly), and retail (purchasing assistance). Although the field of AI has existed since the 1950s, only very recently have computing power and AI methods reached a tipping point for major disruption and rapid advancement. Both of these areas have a tremendous need for much higher memory bandwidth.
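
To see why bandwidth, rather than raw compute, is often the bottleneck, consider a single fully connected layer: in a matrix-vector multiply, each weight is used only once per inference, so the weights stream from memory on every pass. A minimal sketch, with layer sizes and target throughput that are assumptions of mine:

    # Bandwidth demand of one fully connected layer at a given inference rate.
    IN_FEATURES = 4096              # assumed layer width
    OUT_FEATURES = 4096
    BYTES_PER_WEIGHT = 2            # fp16
    INFERENCES_PER_SEC = 10_000     # assumed target throughput

    weight_bytes = IN_FEATURES * OUT_FEATURES * BYTES_PER_WEIGHT
    bandwidth_gbs = weight_bytes * INFERENCES_PER_SEC / 1e9
    print(f"{bandwidth_gbs:.0f} GB/s for one layer")   # ~336 GB/s

One layer at this rate already consumes hundreds of GB/s, and real networks stack many such layers, which is why the much higher per-package bandwidth of HBM matters.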


Transforming financial services with AI technologies

#artificialintelligence

The financial services industry is undergoing one of the largest transformational shifts in decades, driven by the development of new digital products and services, broadening availability of powerful computing solutions, and increased customer adoption of cloud, mobile, web-based and AI technologies. AI techniques like machine learning are permeating nearly every industry, and the financial sector is increasingly recognizing the impact of faster analytical insights on overall business strategy. Deep learning, the fastest-growing field in machine learning, leverages many-layered deep neural networks (DNNs) to learn levels of representation and abstraction that make sense of data such as images, sound, and text. This technique shows great promise for automating a variety of operational processes and ushering in disruptive new business models for the industry. However, these newfound capabilities are quickly pushing conventional computing architectures to their limits.
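
As a concrete illustration of the "many-layered" structure described above, here is a minimal DNN sketch in PyTorch (my choice of framework; the layer sizes and the transaction-scoring framing are illustrative assumptions, not details from the article):

    import torch
    import torch.nn as nn

    # A small deep neural network: stacked layers learn successively more
    # abstract representations of the input features.
    model = nn.Sequential(
        nn.Linear(64, 128), nn.ReLU(),   # raw transaction features -> low-level patterns
        nn.Linear(128, 128), nn.ReLU(),  # intermediate abstractions
        nn.Linear(128, 2),               # e.g., legitimate vs. suspicious (hypothetical)
    )

    x = torch.randn(32, 64)              # a batch of 32 feature vectors
    scores = model(x)
    print(scores.shape)                  # torch.Size([32, 2])

Each added layer re-represents the previous layer's output, which is what lets deep networks extract structure from raw inputs like images, sound, and text.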


Deep Learning Frameworks Hands-on Review – Knowm.org

@machinelearnbot

At Knowm, we are building a new and exciting type of computer processor to accelerate machine learning (ML) and artificial intelligence applications. The goal of Thermodynamic-RAM (kT-RAM) is to offload general ML operations, traditionally run on CPUs and GPUs, onto a physically adaptive analog processor based on memristors that unites memory and processing. If you haven't heard yet, we call this new way of computing "AHaH Computing," which stands for Anti-Hebbian and Hebbian Computing; it provides a universal computing framework for in-memory reconfigurable logic, memory, and ML. While we showed some time ago that AHaH Computing is capable of solving problems across many domains of ML, we only recently figured out how to use the kT-RAM instruction set and low-precision, noisy memristors to build supervised and unsupervised compositional (deep) ML systems. Our method does not require the backpropagation-of-error algorithm (backprop) and is achievable with realistic analog hardware, including but not limited to memristors.
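
Knowm's kT-RAM instruction set is proprietary hardware, but the Hebbian/anti-Hebbian idea behind the name can be sketched in plain Python. This is a generic, textbook-style update rule under assumptions of my own (array sizes, learning rate), not Knowm's actual AHaH implementation:

    import numpy as np

    rng = np.random.default_rng(0)
    w = rng.normal(scale=0.1, size=8)   # synaptic weights (stand-in for memristor states)
    x = rng.normal(size=8)              # input activation pattern

    alpha = 0.01                        # learning rate (assumed)
    y = float(w @ x)                    # neuron output

    dw_hebbian = alpha * x * y          # Hebbian: reinforce the current decision
    dw_anti = -alpha * x * y            # anti-Hebbian: oppose it, decorrelating inputs

    # An AHaH-style node combines the two phases; here we apply only the
    # Hebbian step, as a supervised "confirm the decision" signal would.
    w += dw_hebbian

Note that neither update involves propagating an error gradient backward through layers, which is the sense in which such local rules avoid backprop.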


IBM targets AI workloads with POWER9 systems; claims to be faster than x86 - CIOL

#artificialintelligence

Speed to insight is emerging as the key competitive differentiator for businesses as they step into the era of compute- and speed-hungry artificial intelligence (AI) and deep learning workloads. IBM recently announced a new line of accelerated IBM Power Systems servers with this new requirement in mind. The systems are built on its new POWER9 processor, which significantly reduces the training times of deep learning frameworks, from days to hours, and allows building more accurate AI applications in considerably less time. "The era of AI demands a tremendous amount of processing power at unprecedented speed," said Monica Aggarwal, Vice President, IBM India Systems Development Lab. "To meet the demands of the cognitive workload, businesses need to change everything right from the start: the algorithms, the software, and the hardware as well. POWER9 systems bring an integrated AI platform designed to accelerate machine learning and deep learning, with both software and hardware optimized to work together."


Must Read: Top 7 Technology Trends for 2018 - Engineering

#artificialintelligence

"The Computer Society's predictions, based on a deep-dive analysis by a team of leading technology experts, identify top-trending technologies that hold extensive disruptive potential for 2018," said Jean-Luc Gaudiot, IEEE Computer Society President. "The vast computing community depends on the Computer Society as the provider for relevant technology news and information, and our predictions directly align with our commitment to keeping our community well-informed and prepared for the changing technological landscape of the future." Dejan Milojicic, Hewlett Packard Enterprise Distinguished Technologist and IEEE Computer Society past president, said, "The following year we will witness some of the most intriguing dilemmas in the future of technology. Will deep learning and AI indeed expand deployment domains or remain within the realms of neural networks? Will cryptocurrency technologies keep their extraordinary evolution or experience a bubble burst?


Huawei signs AI mobile agreement with Baidu

ZDNet

Huawei has announced signing a strategic agreement to build an open mobile artificial intelligence (AI) ecosystem with Chinese search engine giant Baidu. The strategic cooperation agreement covers AI platforms, technology, internet services, and content ecosystems, Huawei said. The open ecosystem will be built on Huawei's HiAI platform and neural network processing unit (NPU), and on Baidu's PaddlePaddle deep-learning framework and Baidu Brain, which contains Baidu's AI services and assets, allowing AI developers to make use of both companies' technology. Under the partnership, Baidu and Huawei will also work on improved voice and image recognition for smart devices, and will build a consumer augmented reality (AR) software and hardware ecosystem.


AI - Technology of the year

#artificialintelligence

As 2017 comes to a close, I have been noodling about what deserves the title of "Technology of the Year." Clearly, Artificial Intelligence (AI) is the winner! Quite a few terms are used interchangeably when discussing AI, including deep learning, machine learning, neural networks, graph theory, random forests, and so on. AI is the broad subject, describing how intelligence is gained through machine learning using various algorithmic options such as graph theory, neural networks, and random forests. Deep learning is a specialized form of machine learning that uses many-layered neural networks to learn increasingly abstract representations from large sample data sets. I first worked on Artificial Intelligence during my final semester of engineering school.
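
To ground the taxonomy above, here is a minimal scikit-learn sketch contrasting two of the algorithm families mentioned, a random forest and a (shallow) neural network, on the same toy data; the synthetic dataset and hyperparameters are arbitrary choices of mine:

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.neural_network import MLPClassifier
    from sklearn.model_selection import train_test_split

    # Synthetic classification task so the example is self-contained.
    X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    # Two different algorithm families under the broad AI/ML umbrella:
    forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
    net = MLPClassifier(hidden_layer_sizes=(32, 32), max_iter=500,
                        random_state=0).fit(X_tr, y_tr)

    print("forest:", forest.score(X_te, y_te))
    print("net:   ", net.score(X_te, y_te))

Both are machine learning; only the second belongs to the neural-network family that deep learning extends by stacking many more layers.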


A Deep Dive on AWS DeepLens - The New Stack

#artificialintelligence

Last week at Amazon Web Services' re:Invent conference, AWS and Intel introduced AWS DeepLens, an intelligent video camera that can run deep learning algorithms on captured images in real time. The key difference between DeepLens and any other AI-powered camera lies in the horsepower that makes it possible to run machine learning inference models locally, without ever sending the video frames to the cloud. Developers and non-developers rushed to attend the AWS workshop on DeepLens to walk away with a device; there, they were enticed with a hot dog to perform the infamous "Hot Dog or Not Hot Dog" experiment. I managed to attend one of the repeat sessions, and carefully ferried the device back home.
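
DeepLens ships with its own AWS toolchain, but the core idea, running inference locally on captured frames instead of streaming them to the cloud, can be sketched with any off-the-shelf pretrained model. A minimal example using PyTorch/torchvision (my substitution, not the DeepLens stack); the frame filename and the ImageNet "hotdog" class index are assumptions:

    import torch
    from torchvision import models, transforms
    from PIL import Image

    # Local, no-cloud inference: classify a captured frame on-device.
    model = models.resnet18(pretrained=True).eval()

    preprocess = transforms.Compose([
        transforms.Resize(256), transforms.CenterCrop(224),
        transforms.ToTensor(),
        transforms.Normalize(mean=[0.485, 0.456, 0.406],   # standard ImageNet stats
                             std=[0.229, 0.224, 0.225]),
    ])

    frame = Image.open("frame.jpg")          # hypothetical captured camera frame
    with torch.no_grad():
        logits = model(preprocess(frame).unsqueeze(0))
    pred = int(logits.argmax())

    HOTDOG_CLASS = 934                       # assumed ImageNet index for "hotdog"
    print("Hot Dog" if pred == HOTDOG_CLASS else "Not Hot Dog")

Because the model and the frame both live on the device, nothing leaves the camera, which is exactly the privacy and latency argument made for DeepLens.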