Federated Contrastive Learning for Dermatological Disease Diagnosis via On-device Learning Artificial Intelligence

Deep learning models have been deployed in an increasing number of edge and mobile devices to provide healthcare. These models rely on training with a tremendous amount of labeled data to achieve high accuracy. However, for medical applications such as dermatological disease diagnosis, the private data collected by mobile dermatology assistants exists on distributed mobile devices of patients, and each device only has a limited amount of data. Directly learning from limited data greatly degrades the performance of learned models. Federated learning (FL) can train models by using data distributed on devices while keeping the data local for privacy. Existing works on FL assume all the data have ground-truth labels. However, medical data often comes without any accompanying labels, since labeling requires expertise and results in prohibitively high labor costs. The recently developed self-supervised learning approach, contrastive learning (CL), can leverage the unlabeled data to pre-train a model, after which the model is fine-tuned on limited labeled data for dermatological disease diagnosis. However, simply combining CL with FL as federated contrastive learning (FCL) results in ineffective learning, since CL requires diverse data but each device only has limited data. In this work, we propose an on-device FCL framework for dermatological disease diagnosis with limited labels. Features are shared in the FCL pre-training process to provide diverse and accurate contrastive information. After that, the pre-trained model is fine-tuned with local labeled data independently on each device, or collaboratively with supervised federated learning on all devices. Experiments on dermatological disease datasets show that the proposed framework effectively improves the recall and precision of dermatological disease diagnosis compared with state-of-the-art methods.
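The pre-training step described above can be illustrated with a contrastive (InfoNCE-style) loss in which the negatives come from feature vectors shared by other devices, supplying the diversity that a single device's limited data lacks. This is a minimal plain-Python sketch under stated assumptions: the function names, the flat-list feature representation, and the loss form are illustrative, not the paper's actual implementation.

```python
import math

def normalize(v):
    # L2-normalize a feature vector (zero vectors are left unscaled).
    n = math.sqrt(sum(x * x for x in v)) or 1.0
    return [x / n for x in v]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def info_nce(view1, view2, remote_feats, temperature=0.5):
    """InfoNCE-style contrastive loss on one device.

    view1, view2   -- two augmented views of the same local batch
                      (lists of feature vectors, row i of each is a positive pair)
    remote_feats   -- feature vectors shared by other devices, used as
                      extra negatives to enrich contrastive diversity
    """
    loss = 0.0
    for z1, z2 in zip(view1, view2):
        z1, z2 = normalize(z1), normalize(z2)
        pos = math.exp(dot(z1, z2) / temperature)
        negs = sum(math.exp(dot(z1, normalize(r)) / temperature)
                   for r in remote_feats)
        # Pull the positive pair together, push shared negatives apart.
        loss += -math.log(pos / (pos + negs))
    return loss / len(view1)
```

With aligned views and an orthogonal remote negative, the loss is small; it grows as remote features resemble the local anchor, which is what drives the representation apart from other devices' samples.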

Cross-Domain Federated Learning in Medical Imaging Artificial Intelligence

Federated learning is increasingly being explored in the field of medical imaging to train deep learning models on large-scale datasets distributed across different data centers while preserving privacy by avoiding the need to transfer sensitive patient information. In this manuscript, we explore federated learning in a multi-domain, multi-task setting wherein different participating nodes may contain datasets sourced from different domains and are trained to solve different tasks. We evaluated cross-domain federated learning for the tasks of object detection and segmentation across two different experimental settings: multi-modal and multi-organ. The results from our experiments with the cross-domain federated learning framework were very encouraging, with an overlap similarity of 0.79 for organ localization and 0.65 for lesion segmentation. Our results demonstrate the potential of federated learning in developing multi-domain, multi-task deep learning models without sharing data from different domains.
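Federated training of the kind both abstracts describe typically aggregates client updates with FedAvg: a weighted average of each node's parameters, weighted by local dataset size. The sketch below assumes each client's model parameters are flattened into a list of floats; the function name and representation are illustrative, not taken from either paper.

```python
def fed_avg(client_weights, client_sizes):
    """FedAvg-style aggregation step on the server.

    client_weights -- one flat list of parameter values per client
    client_sizes   -- number of local training samples per client,
                      used to weight each client's contribution
    """
    total = sum(client_sizes)
    n_params = len(client_weights[0])
    # Per-parameter weighted average across clients; no raw data leaves a node.
    return [
        sum(w[i] * s for w, s in zip(client_weights, client_sizes)) / total
        for i in range(n_params)
    ]
```

For example, two equally sized clients holding parameters [1.0, 2.0] and [3.0, 4.0] yield the server model [2.0, 3.0]; a client with more local data pulls the average proportionally toward its update.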

Machine Learning and Deep Learning


There has been a steady rise in ML-powered AI applications across industry sectors such as preventive healthcare, banking, finance, and media. Our trained professionals can deliver ML-based solutions to a wide range of business problems as well as anyone in the field.

600 companies investing in deep learning


Another 177 companies (level 2) are developing projects using deep learning with dedicated staff resources, and more than 350 companies (level 1) are experimenting with deep learning in their labs. Given how early deep learning is as a technology, the majority of companies investing in it are IT and software businesses. However, we discovered interesting champions in other industries that are adopting deep learning as well. Given that deep learning has early roots in image processing, it is exciting to see healthcare companies like Siemens Healthcare and GE Healthcare leading the pack, along with research institutions like the NIH and Lawrence Livermore National Labs.