If you are looking for an answer to the question "What is Artificial Intelligence?" and you only have a minute, then here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
In the past few years, deep learning has been all the rage in the tech industry. To keep up, I like to get my hands dirty implementing interesting network architectures I come across in my reading. A few months ago I came across a very nice article called Siamese Recurrent Architectures for Learning Sentence Similarity. It offers a pretty straightforward approach to the common problem of sentence similarity. Named MaLSTM ("Ma" for Manhattan distance), its architecture is depicted in figure 1 (the diagram excludes the sentence preprocessing part). Notice that since this is a Siamese network, it is easier to train because it shares weights on both sides.
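The "Ma" in MaLSTM refers to the way the two sentence representations are compared: the paper uses the exponent of the negative Manhattan (L1) distance between the two LSTM outputs, which yields a similarity score in (0, 1]. A minimal NumPy sketch of that scoring function (the sample vectors are made up for illustration, not taken from the paper):

```python
import numpy as np

def malstm_similarity(h_left, h_right):
    """MaLSTM similarity: exp(-||h_left - h_right||_1).

    h_left and h_right are the final hidden states of the two
    (weight-sharing) LSTMs; identical representations score 1.0,
    and the score decays toward 0 as they drift apart.
    """
    return np.exp(-np.sum(np.abs(h_left - h_right), axis=-1))

# Two toy sentence representations
h1 = np.array([0.2, -0.5, 0.1])
h2 = np.array([0.3, -0.4, 0.0])

print(malstm_similarity(h1, h1))  # identical vectors -> 1.0
print(malstm_similarity(h1, h2))  # similar vectors -> close to 1
```

Because the distance is bounded below by zero, the exponent keeps the score in a range that can be matched directly against human similarity labels rescaled to (0, 1].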
Siamese networks seem to perform well on similarity tasks and have been used for problems like sentence semantic similarity, recognizing forged signatures, and many more. Word embeddings are a modern way to represent words in deep learning models; more about them can be found in this nice blog post. Inputs to the network are zero-padded sequences of word indices. These inputs are vectors of fixed length, where the leading zeros are ignored and the non-zero entries are indices that uniquely identify words.
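To make the padding step concrete, here is a small sketch of how variable-length index sequences get left-padded with zeros to a fixed length (this mirrors the behavior of Keras's `pad_sequences` with `padding='pre'`; the word indices below are invented for illustration):

```python
def pad_sequences(sequences, maxlen, value=0):
    """Left-pad each sequence of word indices with zeros so every
    input vector has the same fixed length; sequences longer than
    maxlen are truncated from the front."""
    padded = []
    for seq in sequences:
        seq = list(seq)[-maxlen:]  # keep at most the last maxlen indices
        padded.append([value] * (maxlen - len(seq)) + seq)
    return padded

# Word indices for two sentences of different lengths
sentences = [[12, 7, 256], [4, 98]]
print(pad_sequences(sentences, maxlen=5))
# [[0, 0, 12, 7, 256], [0, 0, 0, 4, 98]]
```

Reserving index 0 for padding is what lets the embedding layer (and any masking downstream) treat the leading zeros as "no word here" rather than as a real token.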