Artificial intelligence is getting closer to becoming a mind-reading machine, according to a recent Japanese study available on bioRxiv: scientists there have produced an algorithm that can read human minds with rather creepy accuracy. This is not the first attempt at reading minds, but previous efforts were much simpler, deconstructing pictures based on two main characteristics: their pixels and basic shape.

"Our brain processes visual information by hierarchically extracting different levels of features or components of different complexity. These neural networks or AI models can be used as a proxy for the hierarchical structure of the human brain," said Yukiyasu Kamitani, one of the scientists in the study.

Across numerous tests, the artificial intelligence was able to interpret signals coming from the brain and determine which images each subject was perceiving or imagining. To accomplish this complex task, the system relies on a set of artificial neural networks that mimic the human brain through various simulations. The neural networks examined 50 distinct images along with the magnetic resonance scans collected from each observer's brain, and eventually learned to decipher the observers' thoughts.

By decoding activity from the human brain, the algorithm managed to identify the images subjects observed, among them owls, showcases, red mailboxes and airplanes. It also generated images, such as swans, leopards, bowling balls or fish, that each person imagined.

"Whereas it has long been thought that the externalization or visualization of states of the mind is a challenging goal in neuroscience, brain decoding using machine learning analysis of fMRI activity nowadays has enabled the visualization of perceptual content," the research paper said.
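The study's own pipeline uses deep neural networks and real fMRI scans, none of which appear in the article. Still, the core idea it describes, training a decoder that maps recorded brain activity back to image features and then matching the result against a set of candidate images, can be sketched with toy data. Everything below (the simulated voxel responses, the linear decoder, the 50 random candidate feature vectors) is an illustrative assumption, not the actual method from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

n_images, n_features, n_voxels = 50, 20, 100

# Hypothetical stand-in: a feature vector for each of 50 candidate images
features = rng.standard_normal((n_images, n_features))

# Simulated "brain": voxel responses are a noisy linear mixture of the
# features of the image being viewed (real fMRI data is far messier)
mixing = rng.standard_normal((n_features, n_voxels))
voxels = features @ mixing + 0.05 * rng.standard_normal((n_images, n_voxels))

# Train a linear decoder (least squares) mapping voxel patterns -> features
decoder, *_ = np.linalg.lstsq(voxels, features, rcond=None)

# "Read" a fresh brain pattern recorded while the subject views image 7
new_pattern = features[7] @ mixing + 0.05 * rng.standard_normal(n_voxels)
decoded = new_pattern @ decoder

# Identify the viewed image: highest correlation with a candidate's features
scores = [np.corrcoef(decoded, f)[0, 1] for f in features]
predicted = int(np.argmax(scores))
print(predicted)
```

The matching step mirrors what the article describes at a high level: the decoder never outputs a picture directly, it outputs a feature description that is compared against the known candidates.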