I would like to share the paper and code of our latest work, "Self-Supervised Relational Reasoning for Representation Learning", which has been accepted at NeurIPS 2020. There are three key technical differences from contrastive methods such as SimCLR: (i) the replacement of the projection head with a relation module, (ii) the use of a Binary Cross-Entropy (BCE) loss instead of a contrastive loss, and (iii) the use of multiple augmentations instead of just two. In the GitHub repository we have also released pretrained models, a minimal implementation of the method, a step-by-step notebook, and code to reproduce the experiments.

Abstract: In self-supervised learning, a system is tasked with achieving a surrogate objective by defining alternative targets on a set of unlabeled data. The aim is to build useful representations that can be used in downstream tasks without costly manual annotation.
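The three differences can be illustrated with a toy NumPy sketch. This is not the authors' implementation: the linear "encoder", the concatenation-plus-linear "relation module", and the noise-based "augmentations" are all stand-in assumptions; the point is only the training signal, where pairs of augmented views are scored by a relation head and labeled 1 if they come from the same image and 0 otherwise, with a BCE loss over all pairs drawn from K > 2 augmentations.

```python
import numpy as np

rng = np.random.default_rng(0)

def encoder(x, W):
    # toy linear encoder standing in for the CNN backbone (assumption)
    return np.tanh(x @ W)

def relation_score(z1, z2, Wr):
    # toy relation module: concatenate the pair, apply a linear map + sigmoid
    # (the real module is a learned network; this is an assumption)
    pair = np.concatenate([z1, z2], axis=-1)
    return 1.0 / (1.0 + np.exp(-(pair @ Wr)))

def bce(p, y, eps=1e-7):
    # binary cross-entropy, the loss used in place of a contrastive loss
    p = np.clip(p, eps, 1.0 - eps)
    return -np.mean(y * np.log(p) + (1.0 - y) * np.log(1.0 - p))

N, K, D, H = 4, 3, 8, 5          # images, augmentations per image, input dim, embed dim
W = rng.normal(size=(D, H))
Wr = rng.normal(size=(2 * H,))

images = rng.normal(size=(N, D))
# K stochastic "augmentations" per image (additive noise stands in for real transforms)
augs = images[:, None, :] + 0.1 * rng.normal(size=(N, K, D))
z = encoder(augs.reshape(N * K, D), W)
ids = np.repeat(np.arange(N), K)  # which source image each view came from

# all unordered pairs of views; positive (1) if same source image, negative (0) otherwise
i, j = np.triu_indices(N * K, k=1)
labels = (ids[i] == ids[j]).astype(float)
scores = relation_score(z[i], z[j], Wr)
loss = bce(scores, labels)
print(loss)
```

With N = 4 images and K = 3 augmentations there are 66 pairs in total, of which 12 are positives, so using more than two augmentations multiplies the number of relation pairs the loss can exploit per batch.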
Microsoft won't renew the contracts for dozens of news production contractors working at MSN and plans to use artificial intelligence to replace them, several people close to the situation confirmed on Friday. The roughly 50 employees -- contracted through staffing agencies Aquent, IFG and MAQ Consulting -- were notified Wednesday that their services would no longer be needed beyond June 30. "Like all companies, we evaluate our business on a regular basis," a Microsoft spokesman said in a statement. "This can result in increased investment in some places and, from time to time, re-deployment in others. These decisions are not the result of the current pandemic."