I have been telling people for a while that ConceptNet is a valuable source of information for semantic vectors, or "word embeddings" as they've been called since the neural-net people showed up in 2013 and renamed everything. Let's call them "word vectors", even though they can represent phrases too. The idea is to compute a vector space where similar vectors represent words or phrases with similar meanings.
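The idea that "similar vectors represent similar meanings" is usually made concrete with cosine similarity. The sketch below uses tiny made-up 4-dimensional vectors purely for illustration; real word vectors (such as ConceptNet Numberbatch's) have hundreds of dimensions and are learned from data.

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: near 1.0 means similar direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical toy vectors, invented here for illustration only.
vectors = {
    "cat": [0.9, 0.1, 0.0, 0.3],
    "dog": [0.8, 0.2, 0.1, 0.4],
    "car": [0.0, 0.9, 0.8, 0.1],
}

print(cosine_similarity(vectors["cat"], vectors["dog"]))  # high: related words
print(cosine_similarity(vectors["cat"], vectors["car"]))  # low: unrelated words
```

The same function works unchanged on phrase vectors, which is why "word vectors" is a looser name than the objects deserve.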
The interesting thing about TensorFlow is that when you write Python, you are really only designing a graph, which the runtime then executes as optimized C++ (or CUDA) code on your CPU or GPU. Instead of having to write at the C or CUDA level, you can code it all in Python first. The difficulty comes in actually understanding how to properly set up a neural network, convolutional network, etc. A lot of questions come into play: which type of model, what kind of regularization is best, how much dropout you want, and whether you are going to purchase GPUs from Nvidia or try to make it work on CPUs.
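The "you're only designing a graph" point is easiest to see with a toy deferred-execution sketch. This is not TensorFlow's API — the `Node`, `constant`, and `run` names below are invented for illustration — but it shows the same split: building the graph does no arithmetic, and a separate run step evaluates it.

```python
# Toy sketch of define-then-run graph execution, in the spirit of
# TensorFlow's graph mode. All names here are invented for illustration.

class Node:
    def __init__(self, op, inputs=(), value=None):
        self.op, self.inputs, self.value = op, inputs, value

def constant(v):
    return Node("const", value=v)

def add(a, b):
    return Node("add", (a, b))

def mul(a, b):
    return Node("mul", (a, b))

def run(node):
    """Walk the graph and evaluate it -- the 'session'/execution step."""
    if node.op == "const":
        return node.value
    args = [run(n) for n in node.inputs]
    return args[0] + args[1] if node.op == "add" else args[0] * args[1]

# Building y performs no arithmetic; only run(y) does.
y = add(mul(constant(2.0), constant(3.0)), constant(1.0))  # y = 2*3 + 1
print(run(y))  # 7.0
```

In the real framework, that run step is where the graph gets handed to a compiled backend, which is why you never need to drop down to C yourself.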
For example, deep learning, a subset of AI and machine learning, requires massive amounts of data and computational power, but until recently, few viable use cases existed for deep learning, largely due to a lack of data. Today, we're beginning to see deep learning technology take hold, thanks to the massive amounts of data that IoT devices create. In the coming months, the relationship between IoT and AI will become even more symbiotic – and we'll likely see AI, including deep learning, play a critical role in the next big thing for IoT.
This course will get you started in building your FIRST artificial neural network using deep learning techniques. Following my previous course on logistic regression, we take this basic building block and build full-on non-linear neural networks right out of the gate using Python and Numpy. All the materials for this course are FREE.
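To give a flavor of the jump from logistic regression to a non-linear network, here is a minimal sketch (not the course's actual code) of a one-hidden-layer network trained with plain gradient descent on XOR — a problem logistic regression alone cannot solve — using only NumPy. The layer sizes, learning rate, and iteration count are arbitrary choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)  # XOR: not linearly separable

# Input->hidden (2->4) and hidden->output (4->1) weights.
W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

losses = []
for _ in range(5000):
    # Forward pass: the non-linear hidden layer (tanh) is exactly what
    # plain logistic regression lacks.
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    losses.append(float(-(y * np.log(p) + (1 - y) * np.log(1 - p)).sum()))

    # Backward pass: gradients of the binary cross-entropy loss.
    grad_out = p - y
    grad_W2 = h.T @ grad_out
    grad_b2 = grad_out.sum(axis=0)
    grad_h = (grad_out @ W2.T) * (1 - h ** 2)  # derivative of tanh
    grad_W1 = X.T @ grad_h
    grad_b1 = grad_h.sum(axis=0)

    lr = 0.1
    W1 -= lr * grad_W1; b1 -= lr * grad_b1
    W2 -= lr * grad_W2; b2 -= lr * grad_b2

print(np.round(p.ravel(), 2))  # predictions should approach [0, 1, 1, 0]
```

The whole thing is the logistic-regression forward/backward pattern applied twice, once per layer — which is the sense in which logistic regression is the "basic building block."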