Results


How to Develop AI on a Raspberry Pi With Google Colaboratory

#artificialintelligence

Last year Google partnered with the Raspberry Pi Foundation to survey users on what would be most helpful in bringing Google's artificial intelligence and machine learning tools to the Raspberry Pi. Now those efforts are paying off. Thanks to Colaboratory – a new open-source project from Google – engineers, researchers, and makers can now build and run machine learning applications on a simple single-board computer. Google has officially opened up its machine learning and data science workflow – making learning about machine learning or data analytics as easy as using a notebook and a Raspberry Pi. Google's Colaboratory is a research and education tool that can easily be shared via Google's Chrome web browser.
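For readers curious what that workflow looks like in practice, below is a minimal, hypothetical first notebook cell (not taken from Google's materials) that checks the runtime environment and runs a small NumPy calculation, the kind of lightweight data analysis the article has in mind; the sensor readings are made up for illustration.

    # Hypothetical first cell of a Colaboratory notebook backed by a Raspberry Pi
    # runtime: confirm the environment, then run a small data-analysis step.
    import platform
    import numpy as np

    print("Python  :", platform.python_version())
    print("Machine :", platform.machine())  # e.g. 'armv7l' on a Raspberry Pi

    # Made-up sensor-style readings: summarise them with mean and standard deviation.
    readings = np.array([21.3, 21.7, 22.1, 21.9, 22.4])
    print("mean = %.2f, std = %.2f" % (readings.mean(), readings.std()))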


Get ready for the Internet of Battle Things warns US Army AI boffin

#artificialintelligence

A US Army researcher believes that future wars will be fought by human soldiers commanding teams of 'physical and cyber robots', forming an "Internet of Battle Things". "Internet of Intelligent Battle Things (IOBT) is the emerging reality of warfare," says Alexander Kott, chief of the Network Science Division of the US Army Research Laboratory, as AI and machine learning advance. He envisions a future where physical robots fly, crawl, walk, or ride into battle. Robots as small as insects could be used as sensors, and ones as big as large vehicles could carry troops and supplies. There will also be "cyber robots" – basically autonomous programmes – used within computers and networks to protect communications, fact-check, relay information, and protect other electronic devices from enemy malware.


Teaching Autonomous Driving Using a Modular and Integrated Approach

arXiv.org Artificial Intelligence

Autonomous driving is not a single technology but a complex system integrating many technologies, which makes teaching it a challenging task. Indeed, most existing autonomous driving classes focus on just one of the technologies involved. This not only fails to provide comprehensive coverage, but also sets a high entry barrier for students with different technology backgrounds. In this paper, we present a modular, integrated approach to teaching autonomous driving. Specifically, we organize the technologies used in autonomous driving into modules, described in the textbook we have developed as well as in a series of multimedia online lectures that provide a technical overview of each module. Then, once students have understood these modules, the experimental integration platforms we have developed allow them to fully understand how the modules interact with one another. To verify this teaching approach, we present three case studies: an introductory class on autonomous driving for students with only a basic technology background; a new session in an existing embedded systems class demonstrating how embedded system technologies can be applied to autonomous driving; and an industry professional training session that quickly brings experienced engineers up to speed for work in autonomous driving. The results show that students maintain a high interest level and make great progress by starting with familiar concepts before moving on to other modules.


4 Strange New Ways to Compute

IEEE Spectrum Robotics Channel

With Moore's Law slowing, engineers have been taking a cold hard look at what will keep computing going when it's gone. Certainly artificial intelligence will play a role. But there are stranger things in the computing universe, and some of them got an airing at the IEEE International Conference on Rebooting Computing in November.


The Complete Guide to TensorFlow 1.x - Udemy

@machinelearnbot

Are you a data analyst, data scientist, or researcher looking for a guide that will help you increase the speed and efficiency of your machine learning activities? If yes, then this course is for you! In its first year, Google's brainchild TensorFlow has attracted more than 6,000 open-source repositories online. It has helped engineers, researchers, and many others make significant progress with everything from voice/sound recognition to language translation and face recognition. It has also proved useful in the early detection of skin cancer and in preventing blindness in diabetics.
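As a rough illustration of the programming model the course covers (this sketch is not from the course itself), TensorFlow 1.x asks you to define a computation graph first and then evaluate it inside a Session:

    # Minimal TensorFlow 1.x sketch: build a graph, then run it in a Session.
    import tensorflow as tf

    a = tf.placeholder(tf.float32, name="a")
    b = tf.placeholder(tf.float32, name="b")
    total = tf.add(a, b, name="total")

    with tf.Session() as sess:
        print(sess.run(total, feed_dict={a: 3.0, b: 4.0}))  # -> 7.0

Everything before the Session is just graph construction; no numbers flow until sess.run is called with a feed_dict.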


How To Become A Machine Learning Engineer: Learning Path

#artificialintelligence

We will walk you through all aspects of machine learning, from simple linear regressions to the latest neural networks, and you will learn not only how to use them but also how to build them from scratch. A big part of this path is oriented toward Computer Vision (CV), because it is the fastest way to gain general knowledge, and experience from CV transfers readily to any other ML area. We will use TensorFlow as the ML framework, as it is the most promising and production-ready option. Learning works best if you work through the theoretical and practical materials at the same time, so you gain hands-on experience with what you have just learned. Also, if you want to compete with other people on real-life problems, I would recommend registering on Kaggle, as it can be a good addition to your resume.
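As a taste of that "from scratch" starting point, here is a minimal sketch (illustrative only, with synthetic data) of fitting a linear regression by gradient descent in plain NumPy, the kind of exercise the path begins with before moving on to TensorFlow and neural networks:

    # Fit y = w*x + b by gradient descent on synthetic data (illustration only).
    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.uniform(0, 1, size=100)
    y = 2.0 * x + 1.0 + rng.normal(0, 0.05, size=100)  # noisy line: true w=2, b=1

    w, b, lr = 0.0, 0.0, 0.1
    for _ in range(2000):
        err = (w * x + b) - y              # prediction error
        w -= lr * 2.0 * (err * x).mean()   # gradient of mean squared error w.r.t. w
        b -= lr * 2.0 * err.mean()         # gradient w.r.t. b

    print("w = %.2f, b = %.2f" % (w, b))   # should land close to 2.00 and 1.00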


Moore's Law may be out of steam, but the power of artificial intelligence is accelerating

#artificialintelligence

A paper from Google's researchers says they simultaneously used as many as 800 of the powerful and expensive graphics processors that have been crucial to the recent uptick in the power of machine learning (see "10 Breakthrough Technologies 2013: Deep Learning"). Feeding data into deep learning software to train it for a particular task is much more resource-intensive than running the system afterwards, but that still takes significant oomph. Intel has slowed the pace at which it introduces generations of new chips with smaller, denser transistors (see "Moore's Law Is Dead. Now What?"). That slowdown also motivates the startups--and giants such as Google--creating new chips customized to power machine learning (see "Google Reveals a Powerful New AI Chip and Supercomputer").


Moore's Law may be out of steam, but the power of artificial intelligence is accelerating

#artificialintelligence

Google CEO Sundar Pichai was obviously excited when he spoke to developers about a blockbuster result from his machine-learning lab earlier this month. Researchers had figured out how to automate some of the work of crafting machine-learning software, something that could make it much easier to deploy the technology in new situations and industries. But the project had already gained a reputation among AI researchers for another reason: the way it illustrated the vast computing resources needed to compete at the cutting edge of machine learning. A paper from Google's researchers says they simultaneously used as many as 800 of the powerful and expensive graphics processors that have been crucial to the recent uptick in the power of machine learning (see "10 Breakthrough Technologies 2013: Deep Learning"). They told MIT Technology Review that the project had tied up hundreds of the chips for two weeks solid--making the technique too resource-intensive to be more than a research project even at Google.

