Teaching Methods


Scientists develop machine-learning method to predict the behavior of molecules

#artificialintelligence

"By identifying patterns in molecular behavior, the learning algorithm or'machine' we created builds a knowledge base about atomic interactions within a molecule and then draws on that information to predict new phenomena," explains New York University's Mark Tuckerman, a professor of chemistry and mathematics and one of the paper's primary authors. The research team created a machine that can learn complex interatomic interactions, which are normally prescribed by complex quantum mechanical calculations, without having to perform such intricate calculations. To weigh the viability of the tool, they examined how the machine predicted the chemical behavior and then compared their prediction with our current chemical understanding of the molecule. The results revealed how much the machine could learn from the limited training data it had been given.


XGBoost, a Top Machine Learning Method on Kaggle, Explained

#artificialintelligence

Specifically, XGBoost was engineered to exploit every bit of memory and hardware resources available for tree boosting algorithms. The implementation offers several advanced features for model tuning, computing environments, and algorithm enhancement. It can perform the three main forms of gradient boosting (standard Gradient Boosting (GB), Stochastic GB, and Regularized GB), and it is robust enough to support fine-tuning and the addition of regularization parameters. In particular, XGBoost implements gradient boosting for decision trees with an additional custom regularization term in the objective function.
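
As a rough illustration of how those three forms map onto XGBoost's parameters, here is a minimal sketch using the xgboost scikit-learn wrapper; the dataset and parameter values are illustrative, not tuned:

```python
# How the three boosting variants show up in XGBoost's parameters:
#   - plain gradient boosting: learning_rate + n_estimators
#   - stochastic GB: subsample / colsample_bytree draw random row/column subsets
#   - regularized GB: reg_alpha (L1) and reg_lambda (L2) penalize leaf weights
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

model = XGBClassifier(
    n_estimators=200,        # number of boosted trees
    learning_rate=0.1,       # shrinkage on each tree's contribution
    subsample=0.8,           # stochastic GB: row subsampling per tree
    colsample_bytree=0.8,    # stochastic GB: feature subsampling per tree
    reg_alpha=0.1,           # regularized GB: L1 penalty on leaf weights
    reg_lambda=1.0,          # regularized GB: L2 penalty on leaf weights
)
model.fit(X_tr, y_tr)
print("test accuracy:", model.score(X_te, y_te))
```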


Self-learning chip promises to accelerate artificial intelligence Robotics Research

#artificialintelligence

The chip is built around a fully asynchronous neuromorphic many-core mesh that supports a wide range of sparse, hierarchical, and recurrent neural network topologies, with each neuron capable of communicating with thousands of other neurons. Each neuromorphic core includes a learning engine that can be programmed to adapt network parameters during operation, supporting supervised, unsupervised, reinforcement, and other learning paradigms.


Intel announces self-learning AI chip Loihi ZDNet

#artificialintelligence

Intel has announced a neuromorphic artificial intelligence (AI) test chip named Loihi, which it said is aimed at mimicking brain functions by learning from data gained from its environment. The chip is fabricated on Intel's 14nm process technology and features 130,000 neurons; 130 million synapses; a fully asynchronous neuromorphic many-core mesh supporting sparse, hierarchical, and recurrent neural network topologies, with each neuron capable of communicating with thousands of others; and a programmable learning engine for each neuromorphic core. Intel has also developed and tested several algorithms on the chip for path planning, sparse coding, dictionary learning, constraint satisfaction, and dynamic pattern learning and adaptation. Intel researchers have shown a learning rate 1 million times better than that of other typical spiking neural networks, the company claimed, with 1,000 times the energy efficiency of conventional computing used for training systems. CTO of Intel Artificial Intelligence Product Group Amir Khosrowshahi -- who co-founded Nervana Systems, which was purchased by the chip giant in August last year as the central part of Intel's plans for AI -- had in April told ZDNet that the industry needs a new architecture for neural networks.
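
Loihi's on-chip learning rules are programmable and hardware-specific, so no general-purpose code reproduces them; as a generic illustration of the kind of local, spike-driven learning such chips support, here is a toy simulation of a leaky integrate-and-fire neuron with a pair-based STDP weight update (all constants are arbitrary):

```python
# Toy spiking-neuron simulation: one presynaptic input driving one leaky
# integrate-and-fire neuron, with spike-timing-dependent plasticity (STDP)
# adjusting the synaptic weight locally. Not Loihi's actual learning rule.
import numpy as np

rng = np.random.default_rng(1)
T, dt = 500, 1.0                  # timesteps, ms per step
tau_v, v_thresh = 20.0, 1.0       # membrane time constant, spike threshold
tau_trace = 20.0                  # decay of pre/post spike traces
a_plus, a_minus = 0.02, 0.021     # potentiation / depression magnitudes
w = 0.5                           # synaptic weight, pre -> post

v = 0.0                           # postsynaptic membrane potential
pre_trace = post_trace = 0.0      # exponential traces of recent spikes

for t in range(T):
    pre_spike = rng.random() < 0.05          # Poisson-ish presynaptic input
    v += dt * (-v / tau_v) + w * pre_spike   # leaky integration plus input
    post_spike = v >= v_thresh
    if post_spike:
        v = 0.0                              # reset after a spike

    # Decay the traces, then bump them on spikes.
    pre_trace += dt * (-pre_trace / tau_trace) + pre_spike
    post_trace += dt * (-post_trace / tau_trace) + post_spike

    # STDP: pre-before-post strengthens, post-before-pre weakens.
    if post_spike:
        w += a_plus * pre_trace
    if pre_spike:
        w -= a_minus * post_trace
    w = min(max(w, 0.0), 1.0)                # keep the weight bounded

print("final weight:", round(w, 3))
```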


Intel's New Self-Learning Chip Promises to Accelerate Artificial Intelligence Intel Newsroom

#artificialintelligence

The Loihi research test chip includes digital circuits that mimic the brain's basic mechanics, making machine learning faster and more efficient while requiring lower compute power. Compared with technologies such as convolutional neural networks and deep learning neural networks, the Loihi test chip uses far fewer resources on the same task. The self-learning capabilities prototyped by this test chip have enormous potential to improve automotive and industrial applications as well as personal robotics – any application that would benefit from autonomous operation and continuous learning in an unstructured environment. Today, we at Intel are applying our strength in driving Moore's Law and our manufacturing leadership to bring to market a broad range of products -- Intel Xeon processors, Intel Nervana technology, Intel Movidius technology and Intel FPGAs -- that address the unique requirements of AI workloads from the edge to the data center and cloud.


Intel introduces an experimental 'self-learning' chip to make robots smarter

Mashable

Called the "Intel Loihi test chip," the processor is what Intel calls a "neuromorphic chip," meaning it's designed to learn from its environment. "The Intel Loihi research test chip includes digital circuits that mimic the brain's basic mechanics, making machine learning faster and more efficient while requiring lower compute power," Michael Mayberry, managing director of Intel Labs, wrote in a statement. This could help computers self-organize and make decisions ... "This could help computers self-organize and make decisions based on patterns and associations." But Intel's approach is different in that the Loihi test chip is designed to work and learn locally on whatever machine it's inside of.


Practical AI for the enterprise: Getting past vendors blowing smoke

ZDNet

Virtually every enterprise software vendor is pushing AI -- machine learning, cognitive computing, deep learning, and related technologies -- as the ultimate set of technologies to change your business, life, and the world. To cut through the hype and address practical issues of deploying AI in the enterprise, I invited Tiger Tyagarajan, president and CEO of Genpact, a professional services company with almost $3 billion in annual revenue, to appear on episode 246 of CXOTalk. Genpact's emerging focus is on embedding smart cognitive applications into process chains and workflows; it's about learning from mistakes and gaining new experiences along the way. The conversation with Tiger Tyagarajan focuses on practical aspects of deploying AI.


Pseudo-labeling: a simple semi-supervised learning method - Data, what now?

@machinelearnbot

In this post, I will show how a simple semi-supervised learning method called pseudo-labeling can increase the performance of your favorite machine learning models by utilizing unlabeled data. First, train the model on labeled data, then use the trained model to predict labels on the unlabeled data, thus creating pseudo-labels. In competitions, such as those found on Kaggle, the competitor receives the training set (labeled data) and test set (unlabeled data). Pseudo-labeling allows us to utilize the unlabeled data while training machine learning models.
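
A minimal sketch of that loop in scikit-learn might look as follows; the dataset is synthetic, and the confidence threshold is a common refinement rather than part of the post's stated recipe:

```python
# Pseudo-labeling in three steps: train on labeled data, predict labels for
# unlabeled data, then retrain on the combined set. Data here is synthetic.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_lab, y_lab = X[:200], y[:200]      # small labeled set
X_unlab = X[200:]                    # "test set" treated as unlabeled

# 1. Train on the labeled data only.
model = LogisticRegression(max_iter=1000).fit(X_lab, y_lab)

# 2. Predict labels for the unlabeled data; keeping only confident
#    predictions is an optional safeguard, not part of the original recipe.
proba = model.predict_proba(X_unlab)
confident = proba.max(axis=1) > 0.95
pseudo_y = proba.argmax(axis=1)[confident]

# 3. Retrain on the labeled and pseudo-labeled data combined.
X_comb = np.vstack([X_lab, X_unlab[confident]])
y_comb = np.concatenate([y_lab, pseudo_y])
model = LogisticRegression(max_iter=1000).fit(X_comb, y_comb)
print(f"added {confident.sum()} pseudo-labeled examples")
```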


Increasing equity through educational technology

MIT News

"New technologies have tremendous potential to improve student learning," Reich says, "but many pieces in a complex system need to be working seamlessly to make this happen." Through his work as executive director at the MIT Teaching Systems Lab, which now straddles CMS/W and the Office of Digital Learning, Reich works toward finding educational models that incorporate technology in ways that actually will increase quality of education and equity for students. "All over the world, people are looking to see a shift in classroom teaching practice to more active, engaged, inquiry-based collaborative learning," he says. Reich has also created learning tools for teachers through two online courses, Launching Innovation in Schools, done in collaboration with Peter Senge of the Sloan School of Management; and Design Thinking for Leading and Learning.