AI today and tomorrow is mostly about curve fitting, not intelligence

#artificialintelligence

As debates around AI's value continue, the risk of an AI winter is real. We need to take stock of what is real and what is imagined so that the next press release describing some amazing breakthrough can be properly contextualized. Unquestionably, the latest spike of interest in AI technology using machine learning and neuron-inspired deep learning is behind incredible advancements in many software categories. Achievements such as language translation, image and scene recognition, and conversational UIs that were once the stuff of sci-fi dreams are now a reality. Yet even as software using AI-labeled techniques continues to yield tremendous improvements across software categories, both academics and skeptical observers have noted that such algorithms fall far short of what can reasonably be considered intelligent.
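
The "curve fitting" label is easy to make concrete. Below is a minimal sketch (not from the article; the data, model choice, and numbers are illustrative) of how a model can fit its training range almost perfectly while knowing nothing beyond it:

    import numpy as np

    rng = np.random.default_rng(0)
    x_train = np.linspace(0, 5, 40)
    y_train = np.sin(x_train) + rng.normal(scale=0.1, size=x_train.size)

    # Least-squares fit of a degree-9 polynomial: pure function approximation,
    # the same game deep nets play at vastly larger scale.
    model = np.poly1d(np.polyfit(x_train, y_train, deg=9))

    print(np.abs(model(x_train) - y_train).mean())  # tiny: fits what it saw
    print(model(10.0), np.sin(10.0))  # extrapolation: wildly off from -0.54

The fit interpolates beautifully inside [0, 5] and fails completely at x = 10; nothing resembling an understanding of sine was learned, which is the skeptics' point.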


All The Hype Is About AI, But The Real Action Is In IA

#artificialintelligence

The Artificial Intelligence ("AI") vs. Intelligence Augmentation ("IA") debate has been around for over half a century. IA classically refers to the effective use of information technology to augment human capabilities, an idea that dates to the 1950s. AI is increasingly used today to describe, broadly, machines that can mimic human functions such as learning and problem solving, but it was originally founded on the premise that human intelligence can be precisely described and a machine made to simulate it. The term Artificial General Intelligence (AGI) is now often reserved for this latter, stricter definition. There is unprecedented hype today around AI, its incredible recent growth trajectory, its myriad potential applications, and the emergent threats it may pose to society. The broader definition of AI creates confusion, especially for those who may not be following the technology closely.


How Causal Inference Can Lead To Real Intelligence In Machines

#artificialintelligence

Last year, the machine learning community was thrown into disarray when its top minds Yann LeCun, Ali Rahimi, and Judea Pearl faced off over the state of artificial intelligence and machine learning. While Rahimi and Pearl tried to tone down the hype around AI, LeCun was aghast at the scepticism about the intelligence and causal reasoning of the models. As Pearl put it: "I see dozens of 'Data Science Institutes' erected across the country, I read their manifestos and I check their advisory boards. Causality does not seem to be on their agenda, which makes one doubt whether the Ladder has been internalized and where this hype will end."
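
The Ladder here is Pearl's Ladder of Causation: association (seeing), intervention (doing), and counterfactuals (imagining). A minimal sketch, with made-up numbers, of why rung one is not rung two: a hidden confounder makes X look predictive of Y even though forcing X would change nothing:

    import numpy as np

    rng = np.random.default_rng(1)
    n = 200_000
    z = rng.random(n) < 0.5                    # hidden confounder
    x = rng.random(n) < np.where(z, 0.9, 0.1)  # Z strongly drives X
    y = rng.random(n) < np.where(z, 0.8, 0.2)  # Z drives Y; X has no effect

    # Rung one, association: P(Y=1 | X=1) looks like a strong effect (~0.74).
    print(y[x].mean())

    # Rung two, intervention: do(X=1) severs the Z -> X arrow, so
    # P(Y=1 | do(X=1)) = P(Y=1), about 0.5. The association was confounding.
    print(y.mean())

A purely curve-fitting model trained on (x, y) pairs would happily report the 0.74, which is exactly the gap Pearl argues today's systems cannot cross on their own.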


The Silent Rockstar of BigData: Machine Learning

#artificialintelligence

Too much data and too few people: first, it is no surprise that machine learning algorithms can work at a pace their data scientist counterparts cannot match. Properly trained, a machine could easily absorb the majority of the data preparation and analysis demand in the data analytics world. Another appealing property of machine learning is that once the code is prepped and the machine is programmed, it can be reused many times and in many places. The trick is not to overreach at the start: apply it to overhead tasks first and keep making it more sophisticated, so that it ends up doing the heavy lifting and easing the resource demand. Machine learning can thus single-handedly reduce the big-data resource crunch and make resource distribution relevant and appropriate.
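
A minimal sketch of the "prep once, reuse everywhere" point, assuming a scikit-learn-style workflow (the dataset, model choice, and file name are illustrative, not from the post):

    import joblib
    from sklearn.datasets import load_iris
    from sklearn.linear_model import LogisticRegression

    X, y = load_iris(return_X_y=True)

    clf = LogisticRegression(max_iter=1000)  # pay the training cost once
    clf.fit(X, y)
    joblib.dump(clf, "model.joblib")         # "prep the code" a single time

    # Any later job, anywhere, reloads and applies it with no retraining.
    reused = joblib.load("model.joblib")
    print(reused.predict(X[:5]))

The one-time training cost is amortized over every subsequent use, which is the resource-crunch argument in miniature.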