EETimes - Will Machines Ever Fully Understand What They Are Seeing?


Embedded vision technologies are giving machines the power of sight, but today's systems still fall short of understanding all the nuances of an image. An approach used for natural language processing could address that.

Attention-based neural networks, particularly transformer networks, have revolutionized natural language processing (NLP), giving machines a better understanding of language than ever before. The attention technique, which mimics cognitive processes by letting a neural network weigh every part of its input and so build a sense of history and context, has produced far more capable AI agents than older memory-based approaches such as long short-term memory (LSTM) networks and other recurrent neural networks (RNNs). NLP models now grasp the questions or prompts they are fed at a deeper level and can generate long passages of text that are often indistinguishable from what a human might write.
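To make the mechanism concrete, the sketch below shows the core operation of a transformer layer, scaled dot-product self-attention, in plain NumPy. The function name, shapes, and toy inputs are illustrative assumptions for this article, not code from any specific model or product discussed here; they simply show how every position in a sequence draws on context from every other position in a single step.

```python
import numpy as np

def scaled_dot_product_attention(queries, keys, values):
    """Weight each value by how well its key matches each query.

    queries, keys: (seq_len, d_k); values: (seq_len, d_v)
    Returns (seq_len, d_v) context vectors.
    """
    d_k = queries.shape[-1]
    # Similarity of every query to every key, scaled for numerical stability.
    scores = queries @ keys.T / np.sqrt(d_k)
    # Softmax over keys: each row becomes a distribution over "where to look".
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output is a weighted mix of all value vectors, so every position
    # sees the whole sequence at once -- the "context" the article describes.
    return weights @ values

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    seq_len, d_model = 4, 8                       # toy sequence of 4 tokens
    x = rng.standard_normal((seq_len, d_model))
    out = scaled_dot_product_attention(x, x, x)   # self-attention
    print(out.shape)                              # (4, 8)
```

Unlike an LSTM or other RNN, which must pass information step by step through a hidden state, this operation relates all positions to one another in parallel, which is what gives transformers their stronger grasp of long-range context.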