Interpretability is a key factor in the design of automatic classifiers for medical diagnosis. Deep learning models have proven to be very effective classifiers when trained in a supervised way with enough data; the main concern is the difficulty of inferring interpretable rationales from them. Several attempts have been made in recent years to convert deep learning classifiers from high-confidence statistical black boxes into self-explanatory models. In this paper we advance the generation of explanations by identifying the independent causes that a deep learning model uses to classify an image into a certain class, combining Independent Component Analysis (ICA) with a score visualization technique. We study the medical problem of classifying an eye fundus image into 5 levels of Diabetic Retinopathy, and conclude that only 3 independent components are sufficient to differentiate and correctly classify the 5 standard disease classes. We propose a method for visualizing these components and detecting lesions from the generated visual maps.
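The core idea of the abstract, reducing a deep model's internal representation to a handful of independent components, can be sketched as below. This is a minimal illustration, not the authors' actual pipeline: the activation matrix is synthetic stand-in data, and the layer choice and shapes are assumptions.

```python
# Sketch: extracting a small number of independent components from the
# penultimate-layer activations of a trained classifier. All names and
# shapes here are illustrative assumptions, not the paper's actual code.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)

# Stand-in for deep features of N eye-fundus images (e.g. penultimate layer).
n_images, n_features = 200, 64
activations = rng.normal(size=(n_images, n_features))

# Recover 3 independent components, the number the paper finds sufficient
# to separate the 5 diabetic-retinopathy grades.
ica = FastICA(n_components=3, random_state=0)
sources = ica.fit_transform(activations)  # per-image component scores, (200, 3)
mixing = ica.mixing_                      # maps components back to features, (64, 3)
```

In such a setup, each row of `sources` gives an image's score on each independent cause, and the columns of `mixing` indicate which deep features each cause loads on, which is the starting point for projecting component scores back onto the image as a visual map.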
It's fun to complain that powerful Artificial General Intelligence (Hal-AI), the kind destined to enslave us, hasn't yet cured cancer. But focusing too much on what Hal-AI can't yet do makes it easy to overlook what Practical Artificial Intelligence (Siri-AI) can already accomplish. For example, consider a recent article by Dr. Dave Levin, former CMIO of the Cleveland Clinic. He claims that AI currently offers little of value to healthcare: "Chronic diseases like diabetes and hypertension… recognizing and treating acute conditions like sepsis, heart attacks and strokes… better prenatal care, prevention and wellness. This is where the vast burden of illness, suffering and costs lie... AI likely has little to offer here of immediate value and can divert resources and attention from these harder (and frankly less sexy) needs."
Two years after originally announcing it, Medtronic and IBM Watson have launched their joint platform, Sugar.IQ, a digital diabetes assistant. "It is designed for people who are currently using Guardian Connect; so made for people on multiple daily injections. It is a personal assistant a little bit like Alexa or Siri," Huzefa Neemuchwala, global head of digital health solutions and AI at Medtronic, said in a Facebook Live informational session. "It is an intelligent assistant that keeps track of all of your information and has all of your information in one place. Then through Watson technology we use this information to power insights so we can better manage your diabetes so that you can spend more time in range."
Treatment recommendations within Clinical Practice Guidelines (CPGs) are largely based on findings from clinical trials and case studies, referred to here as research studies, which often draw on highly selective clinical populations, referred to here as study cohorts. When medical practitioners apply CPG recommendations, they need to understand how well their patient population matches the characteristics of the study cohort, and thus are confronted with the challenges of locating the study cohort information and making an analytic comparison. To address these challenges, we develop an ontology-enabled prototype system that exposes the population descriptions in research studies in a declarative manner, with the ultimate goal of allowing medical practitioners to better understand the applicability and generalizability of treatment recommendations. We build a Study Cohort Ontology (SCO) to encode the vocabulary of study population descriptions, which are typically reported in the first table of a published study and hence often referred to as Table 1. We leverage the widely used Semanticscience Integrated Ontology (SIO) to define property associations between classes. Further, we model the key components of Table 1s, i.e., collections of study subjects, subject characteristics, and statistical measures, in RDF knowledge graphs. We design scenarios in which medical practitioners perform population analysis, and we generate cohort similarity visualizations to determine the applicability of a study population to the clinical population of interest. Our semantic approach, which makes study populations visible through standardized representations of Table 1s, allows users to quickly derive clinically relevant inferences about study populations.
Understanding changes in patient experiences is an important facet of enhancing patient engagement and centricity. Judging by discussions at the recent PanAgora Pharma Customer Experience Summit in New Jersey, it is clear that new digital devices and AI are changing the way patients search for medical information on the internet. Voice is the new digital experience. With the rise of novel Voice User Interface (VUI) household products, such as Amazon Alexa and Google Home, and VUI integration in smart devices, Murray Izenwasser, VP of Digital Transformation and CMO of AAJ Technologies, discussed the big opportunity such devices offer to better connect patients with the biopharmaceutical industry. Izenwasser showed that VUI devices have penetrated the market faster than smartphones, TV, radio, and the Internet.