Results


Machine Learning - Vistatec

#artificialintelligence

We have all heard this new buzzword in the world of technology. It is the new trend, everybody wants to jump on the bandwagon, and we are all making claims about how amazingly good and new it is: Machine Learning. But is it really that new? Machine Learning is just another fancy name for an area within a broader field called Predictive Analysis, or Predictive Modelling. This branch of statistics was born in the 1940s, when governments invested in it for military purposes.


A brain-inspired chip from IIT-Delhi could be the next big leap in AI hardware FactorDaily

#artificialintelligence

"The human brain has 100 billion neurons, each neuron connected to 10 thousand other neurons. Sitting on your shoulders is the most complicated object in the known universe," said Michio Kaku, physicist and futurist. The human brain, which not only stores but also computes, is by far the most powerful and complex computer in the world, yet it occupies just 1.3 litres of space and consumes about 20 watts of power. In comparison, the finest supercomputers in the world require gigawatts of power, massive real estate, infrastructure, and dedicated cooling systems while attempting to perform brain-like tasks. Understanding how the human brain functions and replicating it has been a lifelong quest for the scientific and research community. Enter neuromorphic computing, a concept developed by American scientist and researcher Carver Andress Mead in the late 1980s, which tries to emulate certain functions of the human brain in silicon.


Artificial Neural Networks: How To Understand Them And Why They're Important

#artificialintelligence

If you dip even a toe into the realm of artificial intelligence, you'll come across artificial neural networks. Artificial neural networks are the systems that power artificial intelligence. Rather than simply executing code it already understands, this type of computer processes vast amounts of information to build an understanding of what is right in front of it. People think the key to understanding neural networks is calculus, but this system of computing has its roots in biology.
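The biological inspiration the excerpt mentions can be sketched as a single artificial neuron: it weighs its inputs, sums them, and passes the result through an activation function, loosely mirroring how a biological neuron fires. The weights and inputs below are made-up numbers for illustration, not values from any real network.

```python
import math

def neuron(inputs, weights, bias):
    """Weighted sum of inputs plus bias, squashed by a sigmoid activation."""
    total = sum(i * w for i, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-total))  # sigmoid maps the sum into (0, 1)

# Two inputs with hypothetical weights; the output is the neuron's "firing" level.
print(neuron([0.5, 0.8], [0.4, -0.2], 0.1))
```

Stacking many such units in layers, and adjusting their weights from data, is what turns this biological metaphor into a working neural network.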


AI vs. Humans: Upending the Division of Labor

#artificialintelligence

Despite transitional growing pains, the promise of artificial intelligence (AI) in innovation and decision-making offers a future with better decisions made at the command of but not by humans. That's what Pradeep Dubey, director of the Parallel Computing Laboratory at Intel, told attendees of a plenary talk at the PEARC18 conference in Pittsburgh, Pa., on July 25. "Humans and machines have had this very nice separation of labor," Dubey said. "Humans make decisions; machines crunch numbers … but humans are terrible decision makers." The annual Practice and Experience in Advanced Research Computing (PEARC) conference--with the theme Seamless Creativity--stresses key objectives for those who manage, develop and use advanced research computing throughout the U.S. and the world.


This AI Calculates at the Speed of Light - D-brief

#artificialintelligence

Light, on the other hand, travels 186,282 miles in a second. Imagine the possibilities if we were that quick-witted. Well, computers are getting there. Researchers from UCLA on Thursday revealed a 3D-printed, optical neural network that allows computers to solve complex mathematical computations at the speed of light. In other words, we don't stand a chance.


Novel synaptic architecture for brain inspired computing

#artificialintelligence

The findings are an important step toward building more energy-efficient computing systems that also are capable of learning and adaptation in the real world. They were published last week in a paper in the journal Nature Communications. The researchers, Bipin Rajendran, an associate professor of electrical and computer engineering, and S. R. Nandakumar, a graduate student in electrical engineering, have been developing brain-inspired computing systems that could be used for a wide range of big data applications. Over the past few years, deep learning algorithms have proven to be highly successful in solving complex cognitive tasks such as controlling self-driving cars and language understanding. At the heart of these algorithms are artificial neural networks -- mathematical models of the neurons and synapses of the brain -- that are fed huge amounts of data so that the synaptic strengths are autonomously adjusted to learn the intrinsic features and hidden correlations in these data streams.


Novel synaptic architecture for brain inspired computing

#artificialintelligence

The brain, with all its magnificent capabilities, is powered by less than 20 watts. Stop to think about that for a second. As I write this blog my laptop is using about 80 watts, yet at only a fourth of that power, our brain outperforms state-of-the-art supercomputers by several orders of magnitude in energy efficiency and volume. For this reason it shouldn't be surprising that scientists around the world are looking to the human brain for inspiration as a promising avenue towards next-generation AI computing systems. And while the IT industry has made significant progress in the past several years, particularly in using machine learning for computer vision and speech recognition, current technology is hitting a wall when it comes to deep neural networks matching the power efficiency of their biological counterparts. But this could be about to change. As reported last week in Nature Communications, my colleagues and I at IBM Research, with collaborators at EPFL and the New Jersey Institute of Technology, have developed and experimentally tested an artificial synapse architecture using 1 million devices, a significant step towards realizing large-scale and energy-efficient neuromorphic computing technology.


Management AI: Types Of Machine Learning Systems

#artificialintelligence

Developers know a lot about the machine learning (ML) systems they create and manage; that's a given. However, non-developers also need a high-level understanding of the types of systems. Artificial neural networks and expert systems are the two classical key classes. With advances in computing performance, software capabilities, and algorithm complexity, analytical algorithms can arguably be said to have joined the other two. This article is an overview of the three types.
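The classical split the excerpt describes is easy to see in code: an expert system encodes human knowledge as explicit, hand-written rules, whereas a neural network learns its behaviour from data. The medical-style rules and thresholds below are invented purely for illustration.

```python
def expert_system(temp_c):
    """A toy expert system: rules authored by a human expert, not learned."""
    if temp_c > 38.0:
        return "fever"
    if temp_c < 35.0:
        return "hypothermia"
    return "normal"

# The rules fire deterministically; changing behaviour means editing the rules,
# whereas a neural network would instead be retrained on new data.
print(expert_system(39.2))
```

This transparency is the expert system's strength and its limit: every case must be anticipated by a human, which is exactly the gap learned systems fill.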

