Results


Novel synaptic architecture for brain inspired computing

#artificialintelligence

The findings are an important step toward building more energy-efficient computing systems that are also capable of learning and adapting in the real world. They were published last week in a paper in the journal Nature Communications. The researchers, Bipin Rajendran, an associate professor of electrical and computer engineering, and S. R. Nandakumar, a graduate student in electrical engineering, have been developing brain-inspired computing systems that could be used for a wide range of big data applications. Over the past few years, deep learning algorithms have proven highly successful at solving complex cognitive tasks such as controlling self-driving cars and understanding language. At the heart of these algorithms are artificial neural networks -- mathematical models of the neurons and synapses of the brain -- that are fed huge amounts of data so that their synaptic strengths adjust autonomously, learning the intrinsic features and hidden correlations in the data streams.
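The autonomous weight adjustment described above can be sketched with a toy single neuron. This is an illustrative example only, not the authors' method: the data, learning rate, and delta-rule update are assumptions chosen for the demo.

```python
# Toy sketch: one artificial neuron whose "synaptic strengths" (weights)
# are adjusted automatically from example data via the delta rule.

def train_neuron(samples, lr=0.1, epochs=200):
    """samples: list of (inputs, target) pairs; returns learned weights and bias."""
    n = len(samples[0][0])
    w = [0.0] * n
    b = 0.0
    for _ in range(epochs):
        for x, t in samples:
            y = sum(wi * xi for wi, xi in zip(w, x)) + b   # weighted sum of inputs
            err = t - y
            # Nudge each synaptic weight in the direction that reduces the error
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# The data encode the hidden relation t = 2*x1 - x2; the neuron recovers it
data = [((1, 0), 2), ((0, 1), -1), ((1, 1), 1), ((2, 1), 3)]
w, b = train_neuron(data)
```

After training, the learned weights approximate (2, -1), the correlation hidden in the data stream.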


Novel synaptic architecture for brain inspired computing

#artificialintelligence

The brain, with all its magnificent capabilities, is powered by less than 20 watts. Stop to think about that for a second. As I write this blog my laptop is using about 80 watts, yet at only a fourth of the power, our brain outperforms state-of-the-art supercomputers by several orders of magnitude when it comes to energy efficiency and volume. For this reason it shouldn't be surprising that scientists around the world are seeking inspiration from the human brain as a promising avenue toward the development of next-generation AI computing systems. And while the IT industry has made significant progress in the past several years, particularly in using machine learning for computer vision and speech recognition, current technology is hitting a wall when it comes to deep neural networks matching the power efficiency of their biological counterpart. But this could be about to change. As reported last week in Nature Communications, my colleagues and I at IBM Research and collaborators at EPFL and the New Jersey Institute of Technology have developed and experimentally tested an artificial synapse architecture using 1 million devices--a significant step towards realizing large-scale and energy-efficient neuromorphic computing technology.
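One way to picture a multi-device artificial synapse is as a weight represented by the summed state of several imprecise devices, with each update programming only one device at a time. This is a hedged sketch of that general idea, not the authors' code: the device count, conductance range, step size, and noise level here are all illustrative assumptions.

```python
import random

class MultiDeviceSynapse:
    """Toy model: one synaptic weight spread across several noisy devices."""

    def __init__(self, n_devices=7, g_max=1.0, step=0.1, noise=0.02):
        self.g = [0.0] * n_devices   # conductance state of each device
        self.g_max = g_max           # each device saturates at g_max
        self.step = step             # nominal programming step
        self.noise = noise           # programming is stochastic
        self.counter = 0             # counter arbitrates which device is programmed

    def weight(self):
        return sum(self.g)           # effective weight = summed conductances

    def potentiate(self):
        i = self.counter % len(self.g)                    # pick one device per update
        delta = self.step + random.gauss(0, self.noise)   # noisy conductance change
        self.g[i] = min(self.g_max, max(0.0, self.g[i] + delta))
        self.counter += 1

syn = MultiDeviceSynapse()
for _ in range(20):
    syn.potentiate()
```

Spreading the weight across several devices averages out the noise and limited precision of any single device, which is the intuition behind using many devices per synapse.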


Management AI: Types Of Machine Learning Systems

#artificialintelligence

Developers know a lot about the machine learning (ML) systems they create and manage; that's a given. However, non-developers also need a high-level understanding of the types of systems. Artificial neural networks and expert systems are the two classical classes. With advances in computing performance, software capabilities, and algorithm complexity, analytical algorithms can arguably be said to have joined the other two. This article is an overview of the three types.


Brain-based circuitry just made artificial intelligence a whole lot faster

#artificialintelligence

We take the vast computing power of our brains for granted. But scientists are still trying to get computers to the brain's level. This is how we ended up with artificial intelligence algorithms that learn through virtual neurons -- the neural net. Now a team of engineers has taken another step closer to emulating the computers in our noggins: they've built a physical neural network, with circuits that even more closely resemble neurons. When they tested an AI algorithm on the new type of circuitry, they found that it performed as well as conventional neural nets already in use.


Samsung hires AI experts for research boost

ZDNet

Samsung Electronics has hired two experts in artificial intelligence (AI) as part of its plan to expand its global research capabilities in the area. The new recruits are Dr H Sebastian Seung, Evnin professor in the Neuroscience Institute and Department of Computer Science at Princeton University, and Dr Daniel D Lee, the UPS Foundation chair professor in the School of Engineering and Applied Science at the University of Pennsylvania. The two will work at Samsung Research, the South Korean tech giant's research arm, and "play a central role in building up fundamental research on AI," the company said. Seung is an expert in machines and brains and Lee an expert in robotic systems. Drawing inspiration from the brain, the two researchers together developed algorithms for machine learning by nonnegative matrix factorization.
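The nonnegative matrix factorization mentioned above can be illustrated with the classic multiplicative-update rules. This is a generic textbook sketch, not Samsung's or the researchers' code; the matrix sizes, rank, and iteration count are arbitrary demo choices.

```python
import numpy as np

def nmf(V, rank, iters=200, eps=1e-9):
    """Factor a nonnegative matrix V (m x n) into W (m x rank) @ H (rank x n)."""
    rng = np.random.default_rng(0)
    m, n = V.shape
    W = rng.random((m, rank)) + 0.1   # positive init keeps updates well-defined
    H = rng.random((rank, n)) + 0.1
    for _ in range(iters):
        # Multiplicative updates: factors stay nonnegative by construction
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Recover a known rank-2 nonnegative matrix
rng = np.random.default_rng(1)
V = rng.random((6, 2)) @ rng.random((2, 5))
W, H = nmf(V, rank=2)
rel_error = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

Because the updates only multiply by nonnegative ratios, W and H remain nonnegative throughout, which is what makes the factorization interpretable as additive parts.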


Research on the Brain-inspired Cross-media Neural Cognitive Computing Framework

arXiv.org Artificial Intelligence

To address modeling problems in brain-inspired intelligence, this thesis focuses on semantic-oriented framework design for image, audio, language, and video. The Multimedia Neural Cognitive Computing (MNCC) model was designed based on nervous-system mechanisms and cognitive architecture. Furthermore, the semantic-oriented hierarchical Cross-media Neural Cognitive Computing (CNCC) framework was proposed based on MNCC, and a formal description and analysis of CNCC was given. The framework would effectively improve the performance of semantic processing for multimedia information, and has far-reaching significance for the exploration and realization of brain-inspired computing.


Blockchain and Artificial Intelligence

@machinelearnbot

Blockchain is either a mystery story or the foundation for cryptocurrencies like Bitcoin. What's different about blockchains compared to traditional big-data distributed databases like MongoDB? It's like featuring a product that contains small blocks of brain in the form of dust -- but consider that the innovation efforts of several publicly traded asset managers and banks are also on this quest. Computers are starting to simulate the brain's sensation, action, interaction, perception, and cognition abilities. Blockchain is a new approach to managing and monitoring financial and other transactions; running an innovation department or powerhouse lab without a built-in artificial intelligence component is like trying to join blocks without a reference to the previous block.
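The "blocks referencing the previous block" idea can be made concrete with a minimal hash chain. This is a didactic sketch, not any real blockchain's implementation; the field names are illustrative assumptions.

```python
import hashlib
import json

def make_block(data, prev_hash):
    """Build a block whose hash commits to its data and the previous block's hash."""
    payload = {"data": data, "prev_hash": prev_hash}
    block = dict(payload)
    block["hash"] = hashlib.sha256(
        json.dumps(payload, sort_keys=True).encode()
    ).hexdigest()
    return block

def valid_chain(chain):
    """A chain is valid when every block points at the hash of its predecessor."""
    return all(cur["prev_hash"] == prev["hash"]
               for prev, cur in zip(chain, chain[1:]))

genesis = make_block("genesis", "0" * 64)
chain = [genesis, make_block("tx: A->B 5", genesis["hash"])]
```

Because each block's hash covers the previous block's hash, altering any block changes its hash and breaks every link after it, which is what makes the ledger tamper-evident.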


We Almost Gave Up On Building Artificial Brains - The Crux

#artificialintelligence

Today artificial neural networks are making art, writing speeches, identifying faces and even driving cars. It feels as if we're riding the wave of a novel technological era, but the current rise in neural networks is actually a renaissance of sorts. It may be hard to believe, but artificial intelligence researchers were already seeing the promise of neural networks in mathematical models during World War II. Yet by the 1970s, the field was ready to give up on them entirely. "[T]here were no impressive results until computers grew up, that is until the past 10 years," says Patrick Henry Winston, a professor at MIT who specializes in artificial intelligence.