Deep InfoMax


Establishing Deep InfoMax as an effective self-supervised learning methodology in materials informatics

Moran, Michael, Gusev, Vladimir V., Gaultois, Michael W., Antypov, Dmytro, Rosseinsky, Matthew J.

arXiv.org Artificial Intelligence

The scarcity of property labels remains a key challenge in materials informatics, whereas materials data without property labels are abundant in comparison. By pretraining supervised property prediction models on self-supervised tasks that depend only on the "intrinsic information" available in any Crystallographic Information File (CIF), there is potential to leverage the large amount of crystal data without property labels to improve property prediction results on small datasets. We apply Deep InfoMax as a self-supervised machine learning framework for materials informatics that explicitly maximises the mutual information between a point set (or graph) representation of a crystal and a vector representation suitable for downstream learning. This allows the pretraining of supervised models on large materials datasets without the need for property labels and without requiring the model to reconstruct the crystal from a representation vector. We investigate the benefits of Deep InfoMax pretraining implemented on the Site-Net architecture to improve the performance of downstream property prediction models with small amounts (<10^3) of data, a situation relevant to experimentally measured materials property databases. Using a property label masking methodology, where we perform self-supervised learning on larger supervised datasets and then train supervised models on a small subset of the labels, we isolate Deep InfoMax pretraining from the effects of distributional shift. We demonstrate performance improvements in the contexts of representation learning and transfer learning on the tasks of band gap and formation energy prediction. Having established the effectiveness of Deep InfoMax pretraining in a controlled environment, our findings provide a foundation for extending the approach to address practical challenges in materials informatics.
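The core of Deep InfoMax-style pretraining described above is maximising the mutual information between a local (site- or graph-level) representation of a crystal and its global vector representation, without any property labels. As a minimal illustrative sketch (not the authors' implementation, and with all function names and shapes chosen here for illustration), an InfoNCE-style contrastive bound treats matching local/global pairs within a batch as positives and all other pairings as negatives:

```python
import numpy as np

def infonce_dim_loss(local_feats, global_vecs):
    """InfoNCE-style lower bound on mutual information, as used in
    Deep InfoMax-like pretraining (illustrative sketch only).

    local_feats: (B, D) one pooled local feature vector per crystal
    global_vecs: (B, D) one global representation vector per crystal
    Matching rows are positive pairs; the other B-1 rows in the batch
    serve as negative samples.
    """
    # Score every (local, global) pairing with a dot product.
    scores = local_feats @ global_vecs.T                      # (B, B)
    # Numerically stable log-softmax over each row; the diagonal
    # entry of each row is the positive pair.
    scores = scores - scores.max(axis=1, keepdims=True)
    log_probs = scores - np.log(np.exp(scores).sum(axis=1, keepdims=True))
    # Minimising this loss maximises the mutual-information bound.
    return -np.mean(np.diag(log_probs))

rng = np.random.default_rng(0)
g = rng.normal(size=(8, 16))
# Perfectly aligned local/global pairs give a much lower loss
# than unrelated random features.
loss_aligned = infonce_dim_loss(5.0 * g, 5.0 * g)
loss_random = infonce_dim_loss(rng.normal(size=(8, 16)), g)
```

In the paper's setting the global vector would come from an encoder such as Site-Net and the local features from its intermediate site representations; the loss is minimised on unlabelled CIF-derived data before supervised fine-tuning.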


Prediction of Progression to Alzheimer's disease with Deep InfoMax

Fedorov, Alex, Hjelm, R Devon, Abrol, Anees, Fu, Zening, Du, Yuhui, Plis, Sergey, Calhoun, Vince D.

arXiv.org Machine Learning

Arguably, unsupervised learning plays a crucial role in the majority of algorithms for processing brain imaging. A recently introduced unsupervised approach Deep InfoMax (DIM) is a promising tool for exploring brain structure in a flexible non-linear way. In this paper, we investigate the use of variants of DIM in a setting of progression to Alzheimer's disease in comparison with supervised AlexNet and ResNet inspired convolutional neural networks. As a benchmark, we use a classification task between four groups: patients with stable, and progressive mild cognitive impairment (MCI), with Alzheimer's disease, and healthy controls. Our dataset is comprised of 828 subjects from the Alzheimer's Disease Neuroimaging Initiative (ADNI) database. Our experiments highlight encouraging evidence of the high potential utility of DIM in future neuroimaging studies.


Google Brain, Microsoft plumb the mysteries of networks with AI (ZDNet)

#artificialintelligence

We live in an age of networks. From the social graph of Facebook to the interactions of proteins in the body, more and more of the world is being conceived of and represented as the connections in a network. And understanding those connections can sometimes have stunning business implications, such as when Larry Page and Sergey Brin of Stanford University first proposed PageRank, a model of the network of webpages that became the foundation of Google. Some heavy hitters in artificial intelligence have been working on ways to make machine learning techniques smarter about understanding networks. Late last week, a group of those researchers reported progress in having a neural network figure out the structure of various networks without full knowledge of each network.