Global Big Data Conference

#artificialintelligence

Over the last several years, deep learning (a subset of machine learning in which artificial neural networks imitate the inner workings of the human brain to process data, create patterns and inform decision-making) has been responsible for significant advancements in the field of artificial intelligence. Loosely modeled on the human brain, deep learning is now capable of unsupervised learning from data that is unstructured or unlabeled. This data, often referred to as big data, can be drawn from sources such as social media, internet history and e-commerce platforms, among others. These sources are so vast that it could take humans decades to comprehend them and extract relevant information, but interpreting this data through deep learning allows models to detect objects, recognize speech, translate language and make decisions at remarkable speeds. Many companies recognize the potential locked up in this wealth of information and are increasingly adopting AI systems driven by deep learning to gain a competitive advantage through data and automation.


How sparsification and quantization build leaner AI

#artificialintelligence

Artificial Intelligence (AI) and Machine Learning (ML) are rarely out of the news. Technology vendors are busy jostling for position in the AI-ML marketplace, all keen to explain how their approach to automation can speed everything from predictive maintenance for industrial machinery to predicting which day consumers are most likely to order vegan sausages in their online shopping. Much of the debate around AI itself concerns the resulting software tooling that tech vendors bring to market. We want to know more about how so-called 'explainable' AI functions and what those advancements can do for us. A key part of that explainability concentrates on AI bias and the need to ensure that unconscious (or perhaps semiconscious) human thinking is not programmed into the systems we are creating.
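
For readers unfamiliar with the two techniques named in the headline, here is a minimal NumPy sketch of what they mean in practice: sparsification prunes away small-magnitude weights, and quantization stores the surviving weights at lower precision. The threshold, scale and array shapes below are illustrative assumptions, not anything a particular vendor describes.

```python
# Illustrative sketch (not from the article) of sparsification and quantization.
import numpy as np

def sparsify(weights: np.ndarray, sparsity: float = 0.9) -> np.ndarray:
    """Zero out the smallest-magnitude weights so only ~(1 - sparsity) remain."""
    threshold = np.quantile(np.abs(weights), sparsity)
    return np.where(np.abs(weights) >= threshold, weights, 0.0)

def quantize_int8(weights: np.ndarray):
    """Map float32 weights onto int8 with a single per-tensor scale factor."""
    scale = np.abs(weights).max() / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale  # dequantize later with q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.normal(size=(256, 256)).astype(np.float32)   # assumed toy weight matrix
w_sparse = sparsify(w)                 # ~90% zeros: smaller, faster models
q, scale = quantize_int8(w_sparse)     # 4x smaller storage than float32
print((w_sparse == 0).mean(), q.dtype)
```

In practice the pruned and quantized model is usually fine-tuned again so accuracy recovers, which is part of what makes these "leaner AI" approaches attractive for deployment.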


Council Post: How Can Businesses Take Deep Learning Out Of The Lab And Onto Intelligent Edge Devices?

#artificialintelligence

Dr. Eli David is a leading AI expert specializing in deep learning and evolutionary computation. He is the co-founder of DeepCube. Over the last several years, deep learning has proved to be the key driver of AI advancement. Drawing on how the human brain operates, deep learning has advanced AI applications from computer vision to speech recognition to text and data analysis. Deep learning models are trained in research labs on large amounts of training data to demonstrate how the technology could perform in real-world deployments.


As Machines Get Smarter, Evidence Grows That They Learn Like Us

AITopics Original Links

The brain performs its canonical task, learning, by tweaking its myriad connections according to a secret set of rules. To unlock these secrets, scientists began developing computer models 30 years ago that try to replicate the learning process. Now, a growing number of experiments are revealing that these models behave strikingly like actual brains when performing certain tasks. The algorithm used by a computer model called the Boltzmann machine, invented by Geoffrey Hinton and Terry Sejnowski in 1983, appears particularly promising as a simple theoretical explanation of a number of brain processes, including development, memory formation, object and sound recognition, and the sleep-wake cycle. "It's the best possibility we really have for understanding the brain at present," said Sue Becker, a professor of psychology, neuroscience, and behavior at McMaster University in Hamilton, Ontario.
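
The learning rule behind the Boltzmann machine can be stated compactly: each connection strengthens in proportion to how often its two units are active together while the network is driven by data, and weakens in proportion to how often they are active together while the network runs freely. The sketch below illustrates that idea for a restricted Boltzmann machine trained with one-step contrastive divergence (Hinton's later practical approximation); the toy patterns, layer sizes and learning rate are illustrative assumptions, not details from the article.

```python
# Minimal sketch of the Boltzmann-style learning rule:
#   delta_W  ~  <v h>_data  -  <v h>_model
import numpy as np

rng = np.random.default_rng(1)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

n_visible, n_hidden, lr = 6, 3, 0.1
W = rng.normal(scale=0.1, size=(n_visible, n_hidden))

# Toy binary "data": two repeating patterns.
data = np.array([[1, 1, 1, 0, 0, 0],
                 [0, 0, 0, 1, 1, 1]] * 50, dtype=float)

for epoch in range(100):
    v0 = data
    h0 = sigmoid(v0 @ W)        # "wake" phase: hidden activity driven by data
    v1 = sigmoid(h0 @ W.T)      # "sleep" phase: one step of free-running reconstruction
    h1 = sigmoid(v1 @ W)
    # Strengthen data-driven correlations, weaken model-driven ones.
    W += lr * (v0.T @ h0 - v1.T @ h1) / len(data)

print(np.round(sigmoid(data[:2] @ W), 2))  # hidden activity for the two patterns after training
```

The wake/sleep structure of this update is exactly why the article links the algorithm to processes such as memory formation and the sleep-wake cycle.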


The Brain as Computer: Bad at Math, Good at Everything Else

#artificialintelligence

Painful exercises in basic arithmetic are a vivid part of our elementary school memories. A multiplication like 3,752 × 6,901 carried out with just pencil and paper for assistance may well take up to a minute. Of course, today, with a cellphone always at hand, we can quickly check that the result of our little exercise is 25,892,552. Indeed, the processors in modern cellphones can together carry out more than 100 billion such operations per second. What's more, the chips consume just a few watts of power, making them vastly more efficient than our slow brains, which consume about 20 watts and need significantly more time to achieve the same result. Of course, the brain didn't evolve to perform arithmetic.
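
As a rough back-of-envelope illustration of the efficiency gap described above, one can compare energy per multiplication using the quoted figures; the "3 watts" value for the chip is an assumed stand-in for "a few watts."

```python
# Back-of-envelope comparison (not from the article) using the figures quoted above.
chip_ops_per_s, chip_watts = 100e9, 3.0   # ~100 billion ops/s at an assumed 3 W
brain_seconds, brain_watts = 60.0, 20.0   # ~1 minute of pencil-and-paper work at ~20 W

joules_per_op_chip = chip_watts / chip_ops_per_s
joules_per_op_brain = brain_watts * brain_seconds

print(f"chip:  {joules_per_op_chip:.1e} J per multiplication")
print(f"human: {joules_per_op_brain:.1e} J per multiplication")
print(f"ratio: {joules_per_op_brain / joules_per_op_chip:.1e}x")
```

On these assumed numbers the chip comes out more than ten trillion times more energy-efficient at arithmetic, which is the article's setup for arguing that the brain's strengths lie elsewhere.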