Overview


Python TensorFlow Tutorial - Build a Neural Network - Adventures in Machine Learning

#artificialintelligence

Google's TensorFlow has been a hot topic in deep learning recently. The open source software, designed to allow efficient computation of data flow graphs, is especially suited to deep learning tasks. It is designed to be executed on single or multiple CPUs and GPUs, making it a good option for complex deep learning tasks. In its most recent incarnation – version 1.0 – it can even be run on certain mobile operating systems. This introductory tutorial to TensorFlow will give an overview of some of the basic concepts of TensorFlow in Python. These will be a good stepping stone to building more complex deep learning networks, such as Convolutional Neural Networks and Recurrent Neural Networks, in the package. We'll be creating a simple three-layer neural network to classify the MNIST dataset. This tutorial assumes that you are familiar with the basics of neural networks, which you can brush up on in the neural networks tutorial if required. To install TensorFlow, follow the instructions here. The code for this tutorial can be found in this site's GitHub repository. First, let's have a look at the main ideas of TensorFlow.
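The kind of three-layer MNIST network the tutorial describes can be sketched in plain NumPy to show the structure (the tutorial itself uses TensorFlow ops; the layer sizes and weight scale below are illustrative assumptions, not the tutorial's actual values):

```python
import numpy as np

# Hypothetical layer sizes for MNIST: 784 pixel inputs, 300 hidden units, 10 digit classes
rng = np.random.default_rng(0)
W1 = rng.standard_normal((784, 300)) * 0.03  # input -> hidden weights
b1 = np.zeros(300)
W2 = rng.standard_normal((300, 10)) * 0.03   # hidden -> output weights
b2 = np.zeros(10)

def forward(x):
    """Forward pass: input layer -> ReLU hidden layer -> softmax output layer."""
    h = np.maximum(0, x @ W1 + b1)                       # hidden activations
    logits = h @ W2 + b2
    exp = np.exp(logits - logits.max(axis=-1, keepdims=True))  # stable softmax
    return exp / exp.sum(axis=-1, keepdims=True)         # class probabilities

probs = forward(rng.standard_normal(784))  # one fake "image"
```

In TensorFlow 1.0 the same structure would be expressed as a data flow graph of placeholders and variables, which a session then executes on CPU or GPU.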


An Overview of Python Deep Learning Frameworks

#artificialintelligence

I recently stumbled across an old Data Science Stack Exchange answer of mine on the topic of the "Best Python library for neural networks", and it struck me how much the Python deep learning ecosystem has evolved over the course of the past 2.5 years. The library I recommended in July 2014, pylearn2, is no longer actively developed or maintained, but a whole host of deep learning libraries have sprung up to take its place. Each has its own strengths and weaknesses. We've used most of the technologies on this list in production or development at indico, but for the few that we haven't, I'll pull from the experiences of others to help give a clear, comprehensive picture of the Python deep learning ecosystem of 2017.


Datebook: A Saudi artist mulls terrorism, vivid abstractions and paintings that pick apart beauty standards

Los Angeles Times

This exhibition is devoted to silent film -- specifically, 60 original silent movie posters and a very rare "Silent Oscar" (one of only 15 ever awarded) from the private collection of Dwight Manley.


A framework for Industry 4.0 - welcome to the next industrial revolution

#artificialintelligence

We're surrounded by more and more connected devices we're calling the Internet of Things. We can turn our heating on from our phones on the commute home. Pegs can tell us when to bring the washing in so it doesn't get wet. Cars know the hazards ahead and warn us before we get there so that we can avoid them. Many of the 'things' have been manufactured within the 'Industrial Internet of Things' or 'Industry 4.0'. But where did the 4.0 come from? What was 3.0, and why are you going to hear about it more and more? Welcome to the fourth industrial revolution!


Transfer Learning - Machine Learning's Next Frontier

#artificialintelligence

In recent years, we have become increasingly good at training deep neural networks to learn a very accurate mapping from inputs to outputs, whether those are images, sentences, or label predictions, given large amounts of labeled data.


[session] #IoT Security Certifications @ThingsExpo @PECB #M2M #Security

#artificialintelligence

In his session at @ThingsExpo, Eric Lachapelle, CEO of the Professional Evaluation and Certification Board (PECB), will provide an overview of various initiatives to certify the security of connected devices and future trends in ensuring public trust of IoT.


History as a guide to IoT growth trajectory

@machinelearnbot

Internet of Things (IoT) has generated a ton of excitement and furious activity. However, I sense some discomfort and even dread in the IoT ecosystem about the future – typical when a field is not growing at a hockey-stick pace…


Open issues in genetic programming

#artificialintelligence

It is approximately 50 years since the first computational experiments were conducted in what has become known today as the field of Genetic Programming (GP), twenty years since John Koza named and popularised the method, and ten years since the first issue appeared of the Genetic Programming & Evolvable Machines journal. In particular, during the past two decades there has been a significant range and volume of development in the theory and application of GP, and in recent years the field has become increasingly applied. Despite the successful application of GP to a number of challenging real-world problem domains, and despite progress in developing a theory that explains the behavior and dynamics of GP, a number of significant open issues remain. These issues must be addressed for GP to realise its full potential and to become a trusted mainstream member of the computational problem-solving toolkit.


What is complexity science? What are complex systems?

VideoLectures.NET

We are happy to announce that the Winter School on Complexity Science 2017 videos are now online!


This company is turning FAQs into Amazon Echo skills

PCWorld

People looking for an easier path to integrating with Amazon's Alexa virtual assistant have good news on the horizon. NoHold, a company that builds services for making bots, unveiled a project that seeks to turn a document into an Alexa skill.