

Ensemble Methods in One Picture

#artificialintelligence

Generate the Base Learners: Choose any combination of base learners, based on accuracy and diversity. Each base learner can produce more than one predictive model, if you change variables such as case weights, guidance parameters, or input space partitions. The result is a computational "average" of sorts (which is much more complex than the regular arithmetic average).
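To make that "computational average" concrete, here is a minimal sketch using scikit-learn; the library choice, the base learners, and the hyperparameters are illustrative assumptions, not from the article. Varying a hyperparameter (here, tree depth) yields more than one model per learner type, and soft voting averages the predicted class probabilities.

```python
# Minimal ensemble sketch (assumed setup, not the article's own code).
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Diverse base learners; two depths of the same tree learner show how
# one learner type can contribute more than one predictive model.
base_learners = [
    ("tree_shallow", DecisionTreeClassifier(max_depth=2, random_state=0)),
    ("tree_deep", DecisionTreeClassifier(max_depth=8, random_state=0)),
    ("logreg", LogisticRegression(max_iter=1000)),
    ("nb", GaussianNB()),
]

# Soft voting averages predicted probabilities across the base learners,
# i.e. the computational "average" the excerpt alludes to.
ensemble = VotingClassifier(estimators=base_learners, voting="soft")
ensemble.fit(X_train, y_train)
print("ensemble accuracy:", ensemble.score(X_test, y_test))
```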


Artificial intelligence: embryonic

#artificialintelligence

So, this is my third column delving into what I believe is the over-hyped and often misunderstood subject of artificial intelligence (AI). First, to briefly paraphrase my previous posts: what we understand today as AI is nothing more than clever programming and smart technology. In my first post, I suggested that engineers have developed software and hardware to alleviate the mundaneness of routine activities, such as on the factory floor; created smart sensors to determine the best times to plant seeds and to harvest, along with sensors that can predict weather patterns; and built technology that can assess patient images better than physicians when detecting breast cancer, for example. As a former software engineer, I would similarly develop algorithms and functions that sought patterns in data and, based on the data processed, defined expected behaviors, outcomes, and actions. This is not "intelligence"; it is just clever programming and smart technology. So how do we create an intelligent entity?


Computational Learning Theory

#artificialintelligence

Theoretical results in machine learning mainly deal with a type of inductive learning called supervised learning. In supervised learning, an algorithm is given samples that are labeled in some useful way. For example, the samples might be descriptions of mushrooms, and the labels could be whether or not the mushrooms are edible. The algorithm takes these labeled samples and uses them to induce a classifier: a function that assigns labels to samples, including samples the algorithm has never seen before.
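As a concrete illustration of that loop, here is a toy example in Python with scikit-learn; the library, the feature encoding, and the mushroom measurements are all made-up assumptions for the excerpt's example.

```python
# Toy supervised learning: induce a classifier from labeled samples.
from sklearn.tree import DecisionTreeClassifier

# Each sample describes a mushroom: [cap_diameter_cm, has_ring, odor_score]
# (hypothetical features invented for illustration).
samples = [
    [5.0, 1, 0.1],
    [3.2, 0, 0.9],
    [7.5, 1, 0.2],
    [2.8, 0, 0.8],
]
labels = ["edible", "poisonous", "edible", "poisonous"]  # the useful labels

# Induce a classifier from the previously labeled samples.
clf = DecisionTreeClassifier().fit(samples, labels)

# The classifier now assigns a label to a sample it has never seen.
print(clf.predict([[6.1, 1, 0.15]]))
```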


Informatics as a Fundamental Discipline for the 21st Century

Communications of the ACM

Informatics for All is a coalition whose aim is to establish informatics as a fundamental discipline to be taken by all students in school. Informatics should be seen as being as important as mathematics, the sciences, and the various languages, and recognized by all as a truly foundational discipline that plays a significant role in education for the 21st century. In Europe, education is a matter left to the individual states. However, the education, competencies, and preparedness of the workforce are all important matters for the European Union (EU).


Why AI Underperforms and What Companies Can Do About It

#artificialintelligence

Why is the gap between companies' AI ambition and their actual adoption so large? The answer is not primarily technical. It is organizational and cultural. A massive skills and language gap has emerged between key organizational decision makers and their "AI teams." It is a barrier that promises to stall, delay, or sink algorithmic innovations.


Can you learn Data Science and Machine Learning without Maths?

#artificialintelligence

Data scientist is the No. 1 most promising job in America for 2019, according to a Thursday report from LinkedIn. This comes as no surprise: data scientist topped Glassdoor's list of Best Jobs in America for the past three years, with professionals in the field reporting high demand, high salaries, and high job satisfaction. With the increase in demand, employers are also looking for more skills in modern-day data scientists, who need to be strong in areas such as maths, programming, communication, and problem-solving. In this blog, we are going to explore whether knowledge of mathematics is really necessary to become a good data scientist.


Artificial intelligence platform to detect neurodegenerative diseases

#artificialintelligence

The buildup of abnormal tau proteins in the brain in neurofibrillary tangles is a feature of Alzheimer's disease, but tau also accumulates in other neurodegenerative diseases, such as chronic traumatic encephalopathy and additional age-related conditions. Accurate diagnosis of neurodegenerative diseases is challenging and requires a highly trained specialist. Researchers at the Center for Computational and Systems Pathology at Mount Sinai developed and used the Precise Informatics Platform to apply powerful machine learning approaches to digitized microscopic slides prepared from tissue samples of patients with a spectrum of neurodegenerative diseases. Applying deep learning to these images, the researchers created a convolutional neural network capable of identifying neurofibrillary tangles with a high degree of accuracy, directly from the digitized slides. "Utilizing artificial intelligence has great potential to improve our ability to detect and quantify neurodegenerative diseases, representing a major advance over existing labor-intensive and poorly reproducible approaches," said lead investigator John Crary, MD, PhD, Professor of Pathology and Neuroscience at the Icahn School of Medicine at Mount Sinai.
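For intuition, here is a minimal sketch of a CNN patch classifier ("tangle" vs. "background") in PyTorch; the architecture, patch size, and class labels are assumptions for illustration and are not the actual model behind the Precise Informatics Platform.

```python
# Hypothetical CNN for classifying image patches cut from digitized slides.
import torch
import torch.nn as nn

class TangleNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),  # 128x128 -> 64x64
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),  # 64x64 -> 32x32
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 32 * 32, 2),  # logits: tangle vs. background
        )

    def forward(self, x):
        return self.classifier(self.features(x))

# A batch of 8 assumed 128x128 RGB patches (random stand-in data).
patches = torch.randn(8, 3, 128, 128)
logits = TangleNet()(patches)
print(logits.shape)  # torch.Size([8, 2])
```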


No, Machine Learning is not just glorified Statistics

#artificialintelligence

This meme has been all over social media lately, producing appreciative chuckles across the internet as the hype around deep learning begins to subside. The sentiment that machine learning is really nothing to get excited about, or that it's just a redressing of age-old statistical techniques, is growing increasingly ubiquitous; the trouble is, it isn't true. I get it: it's not fashionable to be part of the overly enthusiastic, hype-drunk crowd of deep learning evangelists. ML experts who in 2013 preached deep learning from the rooftops now use the term only with a hint of chagrin, preferring instead to downplay the power of modern neural networks lest they be associated with the scores of people who still seem to think that import keras is the leap over every hurdle, and that they, in knowing it, have some tremendous advantage over their competition. While it's true that deep learning has outlived its usefulness as a buzzword, as Yann LeCun put it, this overcorrection of attitudes has yielded an unhealthy skepticism about the progress, future, and usefulness of artificial intelligence.


5G will change your business faster than you think

#artificialintelligence

Not that I'm a big fan of such titles, but when I look at what 5G will bring, it's clear most businesses will feel the impact. Most technologies have a slow adoption curve; 5G won't, and this change of speed is going to catch most companies unaware. The technological improvements of better connectivity are apparent, but their consequences aren't. There are two big groups of problems that 5G's lower latency and higher bandwidth will impact. On one side, we have those problems that we can solve with low computation and real-time responses. Think of any remote controller: there are only minor computation needs on the controller side, but the remote actuator needs fast reactions.
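A back-of-the-envelope latency budget makes the point. All figures below are illustrative assumptions, not numbers from the article; typical 4G round trips are often quoted in the tens of milliseconds, while 5G targets single-digit to low double-digit milliseconds.

```python
# Hypothetical latency budget for a network-controlled actuator.
ACTUATOR_DEADLINE_MS = 20  # assumed: actuator must react within 20 ms
COMPUTE_MS = 2             # "minor computation" on the controller side

for network, rtt_ms in [("4G", 50), ("5G", 10)]:  # assumed round trips
    total = rtt_ms + COMPUTE_MS
    verdict = "meets" if total <= ACTUATOR_DEADLINE_MS else "misses"
    print(f"{network}: {total} ms total -> {verdict} the {ACTUATOR_DEADLINE_MS} ms deadline")
```

Under these assumed numbers, only the 5G round trip leaves room for a real-time response, which is the class of problems the excerpt describes.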


3 New Chips to Help Robots Find Their Way Around

IEEE Spectrum Robotics

Robots have a tough job making their way in the world. Life throws up obstacles, and it takes a lot of computing power to avoid them. At the IEEE International Solid-State Circuits Conference last month in San Francisco, engineers presented some ideas for lightening that computational burden. That's a particularly good thing if you're a compact robot, with a small battery pack and a big job to do. Engineers at Intel are experimenting with robot-specific accelerators as part of a collaborative multirobot system.