IBM Watson Health has formed a medical imaging collaborative with more than 15 leading healthcare organizations. The goal: to take on some of the deadliest diseases. The collaborative, which includes health systems, academic medical centers, ambulatory radiology providers, and imaging technology companies, aims to help doctors address breast, lung, and other cancers; diabetes; eye health; brain disease; and heart disease and related conditions, such as stroke. Watson will mine insights from what IBM calls previously invisible unstructured imaging data and combine them with a broad variety of data from other sources, such as electronic health records, radiology and pathology reports, lab results, doctors' progress notes, medical journals, clinical care guidelines, and published outcomes studies. As the work of the collaborative evolves, Watson's rationale and insights will evolve with it, informed by the latest combined thinking of the participating organizations.
A deep learning algorithm can detect metastases in sections of lymph nodes from women with breast cancer; and a deep learning system (DLS) has high sensitivity and specificity for identifying diabetic retinopathy, according to two studies published online December 12 in the Journal of the American Medical Association.
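As background for the metrics quoted above, here is a minimal sketch of how sensitivity and specificity fall out of a confusion matrix. The labels are hypothetical toy values, not the study's data, and the function name is an illustrative choice:

```python
def sensitivity_specificity(y_true, y_pred):
    # Label 1 = disease present (e.g. referable diabetic retinopathy).
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    sensitivity = tp / (tp + fn)  # true-positive rate: diseased cases caught
    specificity = tn / (tn + fp)  # true-negative rate: healthy cases cleared
    return sensitivity, specificity

# Toy example: 6 hypothetical cases, one miss and one false alarm.
sens, spec = sensitivity_specificity([1, 1, 1, 0, 0, 0],
                                     [1, 1, 0, 0, 0, 1])
```

A "high sensitivity and specificity" claim means both ratios are close to 1: the system rarely misses disease and rarely flags healthy patients.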
SAN FRANCISCO -- Google plans to use more than one million anonymized eye scans to teach computers how to diagnose ocular disease. The Mountain View, Calif.-based company has signed a deal with a British eye hospital to use artificial intelligence to learn from the medical records of 1.6 million patients in London hospitals. The goal is to teach a computer program to recognize the signs of two common types of eye disease, diabetic retinopathy and age-related macular degeneration. That is something humans are surprisingly imperfect at.
Hypoglycemia is common and potentially dangerous among those treated for diabetes. Electronic health records (EHRs) are important resources for hypoglycemia surveillance. In this study, we report the development and evaluation of deep learning-based natural language processing systems to automatically detect hypoglycemia events from EHR narratives. Public health experts annotated 500 EHR notes from patients with diabetes. We used this annotated dataset to train and evaluate HYPE, a supervised NLP system for hypoglycemia detection. In our experiment, the convolutional neural network model yielded promising performance (Precision $= 0.96 \pm 0.03$, Recall $= 0.86 \pm 0.03$, F1 $= 0.91 \pm 0.03$) in a 10-fold cross-validation setting. Although the annotated dataset is highly imbalanced, our CNN-based HYPE system still achieved high performance for hypoglycemia detection. HYPE could be used for EHR-based hypoglycemia surveillance and to help clinicians deliver timely treatment to high-risk patients.
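The evaluation protocol above can be sketched in plain Python: a minimal, hedged illustration of how per-fold precision, recall, and F1 are computed and then averaged across a 10-fold split. The CNN itself is omitted, the labels are hypothetical stand-ins for model predictions, and the helper names are illustrative, not from the paper:

```python
import random
from statistics import mean, stdev

def precision_recall_f1(y_true, y_pred):
    """Per-fold metrics; label 1 marks a note containing a hypoglycemia event."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

def kfold_indices(n, k=10, seed=0):
    """Shuffle note indices once, then deal them into k round-robin folds."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    return [idx[i::k] for i in range(k)]
```

In a full run, each fold in turn serves as the held-out test set; the paper-style "$0.91 \pm 0.03$" summary is then just `f"{mean(f1s):.2f} ± {stdev(f1s):.2f}"` over the ten per-fold F1 scores.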