Diagnostic Medicine


Artificial Intelligence Is Putting Ultrasound on Your Phone

WIRED

If Jonathan Rothberg has a superpower, it's cramming million-dollar, mainframe-sized machines onto single semiconductor circuit boards. The entrepreneurial engineer got famous (and rich) inventing the world's first DNA sequencer on a chip. And he's spent the last eight years sinking that expertise (and sizeable startup capital) into a new venture: making your smartphone screen a window into the human body. Last month, Rothberg's startup Butterfly Network unveiled the iQ, a cheap, handheld ultrasound tool that plugs right into an iPhone's lightning jack. You don't have to be a technician to use one--its machine learning algorithms guide the user to find what they might be looking for.



GE Healthcare Executive Sees Data-Driven Medicine in the Present Tense

@machinelearnbot

Picture the hospital of the future: replete with a NASA-like command center featuring scores of information screens, and a radiology department that leverages AI technology to help improve diagnostic accuracy and deep-learning technology to ensure that radiology images are clear. This is the world of data-driven medicine that Charles Koontz sees -- not in a crystal ball but in the real world. "Those technologies are here now, and they are gaining steam," said Koontz, CEO of GE Healthcare Digital and chief digital officer of GE Healthcare, in an interview at GE Digital's Minds + Machines event in San Francisco last week. A 2016 McKinsey study supports the notion that the healthcare sector is embracing digital transformation. The field has seen "some core change," according to McKinsey, basing that assessment on a survey of 10 verticals.



Artificial intelligence: Is this the future of early cancer detection?

#artificialintelligence

A new endoscopic system powered by artificial intelligence (AI) has today been shown to automatically identify colorectal adenomas during colonoscopy. The system, developed in Japan, has recently been tested in one of the first prospective trials of AI-assisted endoscopy in a clinical setting, with the results presented today at the 25th UEG Week in Barcelona, Spain. The new computer-aided diagnostic system uses an endocytoscopic image - a 500-fold magnified view of a colorectal polyp - to analyse approximately 300 features of the polyp after applying narrow-band imaging (NBI) mode or staining with methylene blue. The system compares the features of each polyp against more than 30,000 endocytoscopic images that were used for machine learning, allowing it to predict the lesion pathology in less than a second. Preliminary studies demonstrated the feasibility of using such a system to classify colorectal polyps; however, until today, no prospective studies had been reported.
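To make the "compare features against a reference library" idea concrete, here is a minimal, purely illustrative sketch in Python. It is not the Japanese team's actual system: it assumes a polyp is represented as a roughly 300-dimensional feature vector and classified by a nearest-neighbour vote against labelled reference images; the function name, labels, and toy data are all hypothetical.

```python
import numpy as np

# Illustrative sketch only (not the actual CAD system described above):
# classify a polyp's ~300-dimensional feature vector by comparing it
# against a labelled reference library with a k-nearest-neighbour vote.

def classify_polyp(features, library_features, library_labels, k=5):
    """Predict lesion pathology from a feature vector.

    features:         (n_features,) vector for the polyp under examination
    library_features: (n_images, n_features) reference feature vectors
    library_labels:   (n_images,) labels, e.g. "adenoma" / "non-neoplastic"
    """
    # Euclidean distance from the query polyp to every reference image
    dists = np.linalg.norm(library_features - features, axis=1)
    nearest = np.argsort(dists)[:k]           # indices of the k closest
    votes = [library_labels[i] for i in nearest]
    return max(set(votes), key=votes.count)   # majority vote among the k

# Toy usage: two well-separated synthetic clusters stand in for the
# two pathology classes in the reference library.
rng = np.random.default_rng(0)
lib = np.vstack([rng.normal(0, 1, (50, 300)),   # "non-neoplastic" cluster
                 rng.normal(5, 1, (50, 300))])  # "adenoma" cluster
labels = ["non-neoplastic"] * 50 + ["adenoma"] * 50
print(classify_polyp(rng.normal(5, 1, 300), lib, labels))
```

The real system's sub-second prediction time is plausible for this kind of lookup: with a few tens of thousands of reference vectors, the distance computation is a single small matrix operation.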


Just What the Doctor Ordered: Smarter Systems for AI-Assisted Radiology

The Official NVIDIA Blog

The research team at the Center for Clinical Data Science (CCDS) today received the world's first purpose-built AI supercomputer from the all-new portfolio of NVIDIA DGX systems with Volta. In only eight months -- beginning in December, when the center received NVIDIA's first-generation DGX-1 AI supercomputer -- CCDS data scientists have successfully trained machines to "see" abnormalities and patterns in medical images. Now, having just received the world's first NVIDIA DGX-1 with Volta, and with the all-new DGX Station, the world's first personal AI supercomputer, arriving later this month, CCDS will build on its groundbreaking research to develop a host of new training algorithms and bring the power of AI directly to doctors. The new DGX-1 with Volta delivers AI computing power three times faster than the prior DGX generation, providing the performance of up to 800 CPUs in a single system.


Scanning The Future, Radiologists See Their Jobs At Risk

NPR

He's sitting inside a dimly lit reading room, looking at digital images from the CT scan of a patient's chest, trying to figure out why the patient is short of breath. The reality is this: dozens of companies, including IBM, Google and GE, are racing to develop formulas that could one day make diagnoses from medical images. Health care companies like vRad, which has radiologists analyzing 7 million scans a year, provide data to partners that develop medical algorithms. Chief Medical Officer Eldad Elnekave says computers can detect diseases from images better than humans because they can multitask -- say, look for appendicitis while also checking for low bone density. Radiologist John Mongan is researching ways to use artificial intelligence in radiology.


The UK desperately needs a Radiology AI Incubator – Hugh Harvey – Medium

#artificialintelligence

In the UK academic circuit there are dozens of medical imaging researchers building algorithms on small datasets, but they lack the resources to test them on millions of images, let alone get their product into the market. What is needed is the alignment of big technology companies, the RCR and the NHS governing bodies to drive a fully collaborative vision in the field of radiology AI. We should be capitalising on the NHS as a national system by pooling imaging data and building a nationalised imaging warehouse and technology incubator (I'd like to call this BRAIN -- British Radiology Artificial Intelligence Network). This would create a national institute for AI in radiology, capable of attracting industry partners, funding for researchers, and equipment.