When Merdis Wells visited the diabetes clinic at the University Medical Center in New Orleans about a year ago, a nurse practitioner checked her eyes to look for signs of diabetic retinopathy, a leading cause of blindness. At her next visit, in February of this year, artificial intelligence software made the call. The clinic had just installed a system that's designed to identify patients who need follow-up attention. The Food and Drug Administration cleared the system -- called IDx-DR -- for use in 2018. The agency said it was the first time it had authorized the marketing of a device that makes a screening decision without a clinician having to be involved in the interpretation.
When Google DeepMind's AlphaGo shockingly defeated legendary Go player Lee Sedol in 2016, the terms artificial intelligence (AI), machine learning and deep learning were propelled into the technological mainstream. AI is generally defined as the capacity of a computer or machine to exhibit or simulate intelligent behaviour, as in Tesla's self-driving cars and Apple's digital assistant Siri. It is a thriving field and the focus of much research and investment. Machine learning is the ability of an AI system to extract information from raw data and learn to make predictions from new data. Deep learning is a subset of machine learning that uses multi-layered artificial neural networks to learn patterns directly from data.
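The idea of "learning to make predictions from data" can be illustrated with a toy sketch: fitting a single-parameter model to example points by gradient descent, then predicting on an unseen input. This is a minimal illustration of the concept, not any of the systems mentioned above.

```python
# Toy machine learning: fit y = w * x to example data by gradient
# descent on mean squared error, then predict for a new input.

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # examples where y = 2x

w = 0.0    # the single parameter the model "learns"
lr = 0.01  # learning rate: how far each update step moves w
for _ in range(1000):
    # Gradient of mean squared error with respect to w
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad

print(round(w, 2))      # close to 2.0: the rule recovered from examples
print(round(w * 5, 1))  # prediction for unseen input x = 5
```

The point of the sketch is that the rule (multiply by 2) is never written into the program; it is inferred from the examples, which is the essence of machine learning.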
It's a device that scans a patient's retina to diagnose potential issues in real time. How it works: After the retina is scanned, the images are then analyzed by DeepMind's algorithms, which return a detailed diagnosis and an "urgency score." It all takes roughly 30 seconds. The prototype system can detect a range of diseases, including diabetic retinopathy, glaucoma, and age-related macular degeneration. Most notably, it can do this as accurately as top eye specialists, DeepMind claims.
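The workflow described above (analyze a scan, return per-disease findings plus an "urgency score") can be sketched in a few lines. Everything here is hypothetical: the names, the data structure, and the triage rule are invented for illustration and are not DeepMind's actual algorithm or API.

```python
from dataclasses import dataclass

@dataclass
class ScreeningResult:
    findings: dict         # disease name -> model confidence (0..1)
    urgency_score: float   # 0 (routine follow-up) .. 1 (refer urgently)

def triage(confidences: dict) -> ScreeningResult:
    """Turn per-disease confidences into a single triage decision.

    In this toy rule, urgency is driven by the most confident finding;
    a real system would weigh disease severity, not just confidence.
    """
    urgency = max(confidences.values(), default=0.0)
    return ScreeningResult(findings=confidences, urgency_score=urgency)

result = triage({
    "diabetic_retinopathy": 0.82,
    "glaucoma": 0.10,
    "age_related_macular_degeneration": 0.05,
})
print(result.urgency_score)  # 0.82 -- dominated by the strongest finding
```

In practice the per-disease confidences would come from a trained image model; the sketch only shows how multiple findings might be collapsed into one referral decision.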
Artificial intelligence holds the promise of diagnosing eye diseases faster and more accurately than physicians. It is possible that the technology could replace some of the more routine eye examinations that physicians perform. While this may be the case, a new study indicates that the technology is most effective when physicians and algorithms work in unison to track and detect eye diseases. The research builds upon developments from Google AI, which had shown that Google's health algorithm works almost as well as human medics when screening patients for the common diabetic eye disease called diabetic retinopathy (retinal vascular disease). The new research asked whether the algorithm could do more than simply diagnose disease.
According to the United Nations, 1 billion people globally live with disabilities, and as many as 70 million of them live in India. In India, individuals with disabilities face barriers to success from nonexistent or inaccessible infrastructure, as well as prejudicial beliefs and discriminatory laws. With those challenges in mind, Kyle Keane, lecturer and research scientist in MIT's Department of Materials Science and Engineering, was invited to conduct a 2018 summer workshop in Chennai, India. He reached out to MIT-India, part of MIT International Science and Technology Initiatives (MISTI), for support in bringing a student with him. They not only agreed, but MIT-India Managing Director Mala Ghosh replied, "Why not bring an entire class?"
When Dr. Eric Topol joined an experiment on using artificial intelligence to get personalized nutrition advice, he was hopeful. For two weeks, Topol, a cardiologist at Scripps Research, dutifully tracked everything he ate, wore a sensor to monitor his blood-glucose levels, and even collected and mailed off a stool sample for an analysis of his gut microbiome. The diet advice he got back stunned him: Eat bratwurst, nuts, danishes, strawberries, and cheesecake. "It was crazy stuff," Topol told me. Bratwurst and cheesecake are foods Topol generally shuns because he considers them "unhealthy."
Today, in these vision centers, technicians take eye scans and send them to doctors in Madurai for review. Automated diagnosis can streamline and expand the process, reaching more people in more places -- the kind of "McDonaldization" espoused by Dr. V. The technology still faces regulatory hurdles in India, in part because of the difficulty of navigating the country's bureaucracy. And though Google's eye system is now certified for use in Europe, it is still awaiting approval in the United States. Luke Oakden-Rayner, the director of medical imaging research at the Royal Adelaide Hospital in Australia, said these systems might even need new regulatory frameworks because existing rules weren't always sufficient. "I am not convinced that people care enough about the safety of these systems," he said.
Doctors can sometimes make a diagnosis when faced with cataracts and blurry eye scans. The Google system still struggles to do this. It is trained largely on clear, unobstructed images of the retina, though Google is exploring the use of lower-quality images. Even with this limitation, Dr. Kim said, the system can augment what doctors can do on their own. Aravind already operates small vision centers in many of the cities and villages surrounding Madurai.
The AI-powered, cloud-based system will be available for use by primary care providers. Over 30 million Americans have diabetes, and diabetic retinopathy -- which occurs when elevated blood sugar levels damage retinal blood vessels -- is considered mostly preventable. Still, it causes vision loss in tens of thousands of people each year and is the leading cause of blindness among working-age Americans. "Many patients with diabetes are not adequately screened for diabetic retinopathy since about 50 percent of them do not see their eye doctor on a yearly basis," Malvina Eydelman, MD, said in the FDA's official announcement. She serves as director of the Division of Ophthalmic, and Ear, Nose and Throat Devices at the agency's Center for Devices and Radiological Health.
A new study conducted by researchers from Genentech and Roche offers the first evidence that artificial intelligence can detect the severity of diabetic macular edema, a leading cause of blindness. On Monday, researchers from Genentech and its parent company Roche published the study, "Deep Learning Predicts OCT Measures of Diabetic Macular Thickening From Color Fundus Photographs," in the journal Investigative Ophthalmology & Visual Science. The study showed that artificial intelligence can be used to provide widespread, cost-effective eye screenings via telemedicine to assist ophthalmologists in improving vision outcomes for millions of people with diabetes who may not be getting regular eye exams. The article is the first to be published as part of Roche/Genentech's Ophthalmology Personalized Healthcare initiative. The initiative, Roche said in a statement, aims to combine meaningful large-scale data and AI technology to predict and prevent ocular conditions and ...