Results


4 Strange New Ways to Compute

IEEE Spectrum Robotics Channel

With Moore's Law slowing, engineers have been taking a cold, hard look at what will keep computing going when it's gone. Certainly artificial intelligence will play a role. But there are stranger things in the computing universe, and some of them got an airing at the IEEE International Conference on Rebooting Computing in November. There were some cool variations on classics such as reversible computing and neuromorphic chips, but less-familiar approaches got their time in the sun too, such as photonic chips that accelerate AI, nanomechanical comb-shaped logic, and a "hyperdimensional" speech recognition system.
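
To make "hyperdimensional" concrete, here is a minimal sketch of the core idea behind hyperdimensional computing: symbols become very long random binary vectors that are combined by element-wise XOR (binding) and majority vote (bundling) and compared by Hamming distance. The vector length, the toy feature names, and the helper functions below are illustrative assumptions, not details from the conference work.

    import numpy as np

    DIM = 10_000  # hypervectors are typically thousands of bits long
    rng = np.random.default_rng(0)

    def random_hv():
        """A random binary hypervector; distinct symbols are nearly orthogonal."""
        return rng.integers(0, 2, DIM, dtype=np.uint8)

    def bind(a, b):
        """Bind two hypervectors (element-wise XOR); the result resembles neither input."""
        return a ^ b

    def bundle(vectors):
        """Bundle hypervectors by majority vote; the result resembles every input."""
        return (np.sum(vectors, axis=0) > len(vectors) / 2).astype(np.uint8)

    def hamming(a, b):
        """Normalized Hamming distance: ~0.5 for unrelated vectors, well below 0.5 for related ones."""
        return float(np.mean(a != b))

    # Toy check: a feature that was bundled into a class stays close to that class.
    features = {name: random_hv() for name in ("f1", "f2", "f3", "f4")}
    class_a = bundle([features["f1"], features["f2"], features["f3"]])
    class_b = bundle([features["f2"], features["f3"], features["f4"]])

    print("f1 vs A:", hamming(features["f1"], class_a))  # ~0.25: f1 is part of A
    print("f1 vs B:", hamming(features["f1"], class_b))  # ~0.5: f1 is unrelated to B

    pair = bind(features["f1"], features["f2"])           # role-filler style binding
    print("bound pair vs f1:", hamming(pair, features["f1"]))  # ~0.5: binding hides both inputs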


The Complete Guide to TensorFlow 1.x - Udemy

@machinelearnbot

You will explore the main features and capabilities of TensorFlow, such as the computation graph, data model, programming model, and TensorBoard. One of the instructors is also the author of the book Building Machine Learning Projects with TensorFlow (Packt Publishing). Rezaul Karim has more than 8 years of experience in research and development, with a solid knowledge of algorithms and data structures in C/C++, Java, Scala, R, and Python, focusing on Big Data technologies such as Spark, Kafka, DC/OS, Docker, Mesos, Zeppelin, Hadoop, and MapReduce, and deep learning technologies such as TensorFlow, DeepLearning4j, and H2O Sparkling Water. His research interests include machine learning, deep learning, semantic web/linked data, Big Data, and bioinformatics.
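
For readers new to the library, here is a minimal sketch of the TensorFlow 1.x computation-graph model the course covers: operations are first declared as a graph, then evaluated inside a session, and summaries from the same graph can be inspected in TensorBoard. The particular placeholders, the scalar summary, and the log directory are illustrative choices, not course material.

    import tensorflow as tf  # assumes the TensorFlow 1.x API

    # Build the computation graph; nothing is evaluated yet.
    a = tf.placeholder(tf.float32, name="a")
    b = tf.placeholder(tf.float32, name="b")
    total = tf.add(a, b, name="total")

    # A summary op so the graph and values can be viewed in TensorBoard.
    tf.summary.scalar("total", total)
    merged = tf.summary.merge_all()

    # Execute the graph inside a session, feeding concrete values.
    with tf.Session() as sess:
        writer = tf.summary.FileWriter("/tmp/tf_demo", sess.graph)
        result, summary = sess.run([total, merged], feed_dict={a: 3.0, b: 4.0})
        writer.add_summary(summary)
        writer.close()
        print(result)  # 7.0

Running tensorboard --logdir /tmp/tf_demo afterwards shows the graph and the logged scalar.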


Four things you need to know about neural networks GovInsider

#artificialintelligence

"You see a lot of neural network research focusing on speech recognition, speech generation, vision, recognising faces, recognising images," Chun says. But it is important to note that as governments use citizen data to learn and extract knowledge, citizens must then ask themselves "what are the privacy laws, to what degree our personal data could be used in this manner", Chun adds. Singapore's Government Technology Agency has identified deep learning as a key focus for 2017. And a team from Northeastern University in China have recently developed a neural network that can identify the location of faulty signals in microgrids, which are smaller grids that are connected to the main power grid but can function without it.


How AI Can Keep Accelerating After Moore's Law

MIT Technology Review

A paper from Google's researchers says they simultaneously used as many as 800 of the powerful and expensive graphics processors that have been crucial to the recent uptick in the power of machine learning (see "10 Breakthrough Technologies 2013: Deep Learning"). Feeding data into deep learning software to train it for a particular task is much more resource intensive than running the system afterwards, but that still takes significant oomph. Intel has slowed the pace at which it introduces generations of new chips with smaller, denser transistors (see "Moore's Law Is Dead. Now What?"). That slowdown also motivates the startups--and giants such as Google--creating new chips customized to power machine learning (see "Google Reveals a Powerful New AI Chip and Supercomputer").
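
To give a sense of how hundreds of GPUs can be used at once, here is a minimal, purely conceptual sketch of synchronous data-parallel training: each "device" computes gradients on its own slice of a batch, and the averaged gradient is applied in a single update. The numpy stand-in for real devices, the linear model, and the batch sizes are illustrative assumptions, not details from the Google paper.

    import numpy as np

    rng = np.random.default_rng(0)
    NUM_DEVICES = 4  # stands in for the many GPUs a real job shards work across

    def grad(w, x, y):
        """Mean-squared-error gradient of a linear model x @ w on one shard."""
        err = x @ w - y
        return 2.0 * x.T @ err / len(y)

    # Synthetic regression data; the "true" weights are what training should recover.
    x = rng.normal(size=(4096, 8))
    true_w = rng.normal(size=8)
    y = x @ true_w + rng.normal(scale=0.01, size=4096)
    w = np.zeros(8)

    for step in range(200):
        batch = rng.choice(len(y), size=256, replace=False)
        # Data parallelism: split the batch into per-device shards, compute each
        # shard's gradient "locally", then average them (an all-reduce) and update.
        shards = np.array_split(batch, NUM_DEVICES)
        grads = [grad(w, x[s], y[s]) for s in shards]
        w -= 0.05 * np.mean(grads, axis=0)

    print("max weight error after training:", np.max(np.abs(w - true_w)))

Once trained, the same model needs only the cheap forward pass (x @ w) at serving time, which is the training-versus-inference asymmetry the article describes.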


Sundar Pichai Sees Google's Future in the Smartest Cloud

WIRED

In other words, he hopes the new chip and the new service will set Google's cloud business apart from services offered by its main rivals, including Amazon and Microsoft, the unnamed competitive threat underlying his I/O keynote. Between its two AI labs--Google Brain, based at company headquarters in Silicon Valley, and DeepMind, a London AI startup Google purchased a little more than three years ago--Google is leading the new wave of artificial intelligence research and development that is so rapidly changing entire industries and economies. But the company believes cloud computing--where computing power is rented over the internet to businesses and software developers--could one day bring in far more. Google built its new chip as a better way of serving its own AI services, most notably Google Translate, says Jeff Dean, the uber-engineer who oversees Google Brain.


What's Machine Learning? It's Expensive, Slow and Exclusive -- For Now

#artificialintelligence

AI and NLP are two acronyms many in the world of chatbots toss around glibly, sometimes without themselves understanding what these terms mean. There's a third acronym that's an essential component beneath these two: ML, which stands for machine learning. Machine learning is a lot easier to explain in one tweet than AI or NLP: it's the process by which an advanced software system trains itself from a massive set of examples, rather than being explicitly programmed with rigid algorithms devised by human coders. Over time, it gets better and better as it acquires more data to train on. An ML system is still programmed with standard one-and-zero logic, but it is built to modify its behavior to meet specified goals based on patterns it discovers in the sample data.
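
As a toy illustration of that definition, the sketch below fits a model from labelled examples rather than hand-coded rules, and its fit improves as more training data arrives. The synthetic data, the learning rate, and the single-feature linear model are illustrative assumptions, not anything from the article.

    import numpy as np

    rng = np.random.default_rng(1)

    def make_examples(n):
        """Synthetic examples of an unknown rule (roughly y = 3x + 2) plus noise."""
        x = rng.uniform(-1, 1, n)
        y = 3.0 * x + 2.0 + rng.normal(0, 0.1, n)
        return x, y

    def train(x, y, steps=500, lr=0.1):
        """Learn a weight and bias by gradient descent on mean squared error."""
        w, b = 0.0, 0.0
        for _ in range(steps):
            err = (w * x + b) - y
            w -= lr * 2.0 * np.mean(err * x)  # MSE gradient with respect to w
            b -= lr * 2.0 * np.mean(err)      # MSE gradient with respect to b
        return w, b

    # The program is never told the rule; it discovers it from the examples,
    # and more examples generally give a better estimate.
    for n in (10, 100, 1000):
        x, y = make_examples(n)
        w, b = train(x, y)
        print(f"{n:5d} examples -> learned y ~ {w:.2f}x + {b:.2f}")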