A Stanford University research team applied machine learning to health records to help hospitals and hospices give better end-of-life care to the terminally ill. Researchers examined Electronic Health Record (EHR) data from Stanford Hospital and Lucile Packard Children's Hospital. The data, covering the health history of around two million child and adult patients, was used to train a neural network that can now predict the mortality of people with serious or terminal illnesses. The idea is that by telling hospitals and hospices when patients are likely to die, end-of-life care can be prioritised more intelligently. "We demonstrate that routinely collected EHR [electronic health record] data can be used to create a system that prioritises patients for follow up for palliative care," the Stanford researchers explain.
While the use of artificial intelligence to predict deaths may sound unsettling, researchers are trying to establish the technology's potential for alerting physicians and other medical professionals to patients who are at greater risk of dying in the near future. That way, doctors can take the right end-of-life approach with patients and their loved ones. A team at Stanford University examined the use of artificial intelligence in palliative care in their paper "Improving Palliative Care with Deep Learning," published on the arXiv preprint server. The researchers used deep learning, a machine learning technique that uses neural networks to learn from massive amounts of data. They built a model and fed its deep learning algorithm data from the Electronic Health Records of two million adult and child patients admitted to either Stanford Hospital or Lucile Packard Children's Hospital.
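The Stanford system itself is a deep neural network trained on EHR codes; the snippet below is only a minimal sketch of the underlying idea, using a much simpler stand-in (a logistic-regression risk scorer over "bag of diagnosis codes" features). All variable names and data here are synthetic inventions for illustration, not the paper's model or dataset:

```python
import numpy as np

rng = np.random.default_rng(0)

N_CODES = 20      # toy vocabulary of diagnosis/procedure codes
N_PATIENTS = 500

# Synthetic "bag of codes": how often each EHR code appears per patient.
X = rng.poisson(1.0, size=(N_PATIENTS, N_CODES)).astype(float)

# Ground truth for the synthetic data only: a few "high-risk" codes
# drive the mortality label.
true_w = np.zeros(N_CODES)
true_w[:3] = 2.0
logits = X @ true_w - 4.0
y = (rng.random(N_PATIENTS) < 1 / (1 + np.exp(-logits))).astype(float)

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

# Fit a logistic-regression risk model by plain gradient descent.
w = np.zeros(N_CODES)
b = 0.0
lr = 0.1
for _ in range(5000):
    p = sigmoid(X @ w + b)
    w -= lr * (X.T @ (p - y) / N_PATIENTS)
    b -= lr * np.mean(p - y)

# Rank patients by predicted mortality risk, analogous to how the
# Stanford system prioritises patients for palliative-care follow-up.
risk = sigmoid(X @ w + b)
ranked_patients = np.argsort(-risk)   # highest-risk patients first
accuracy = np.mean((risk > 0.5) == y)
```

The key design point the sketch preserves is that the model consumes only routinely collected coded EHR data and outputs a ranked risk score, rather than a hard yes/no prediction.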
Artificial intelligence is starting to play a transformational role in the healthcare industry, even if opportunities for using it are just beginning to be explored. That's an initial finding of a new report from JASON, an independent group of scientists advising the federal government on science and technology issues. The Department of Health and Human Services and the Robert Wood Johnson Foundation commissioned the report; the names of the scientists who developed the report are not being released. Computers can match human competence in image recognition and, in some studies, can make diagnostic decisions on medical images that match or exceed the ability of clinicians. Technology is also getting better at speech recognition and natural language processing.
[Figure: A new algorithm uses brain activity to create reconstructions (bottom two rows) of observed photos (top row).] Imagine searching through your digital photos by mentally picturing the person or image you want. Or texting a loved one a sunset photo that was never captured on camera. A computer that can read your mind would find many uses in daily life, not to mention for those paralyzed and with no other way to communicate. Now, scientists have created the first algorithm of its kind to interpret, and accurately reproduce, images seen or imagined by another person.
The state of value-based reimbursement efforts has been uncertain. Many healthcare organizations are indeed pursuing newer strategies to replace traditional fee-for-service care while reducing costs and improving quality, but progress has often been halting. Still, experts from Cedars-Sinai, CVS Health, Blue Cross NC and Harvard Pilgrim Health Care say they're quite optimistic about the future of value-based care in 2018 and beyond. In the area of health IT, the shift to value-based care is fueling new uses for data and has the potential to reinvigorate the electronic health records that many feared had gone stale, said Scott Weingarten, senior vice president and chief clinical transformation officer at Cedars-Sinai and an innovator in the value-based care space. "I believe that natural language processing, machine learning and artificial intelligence have the potential to significantly improve the interpretation, understanding and usefulness of information documented in the electronic health records and other information sources," Weingarten said.
The source of the common hospital-acquired infection known as C. diff can be hard to pin down in a busy, sprawling hospital, where patients might pick up the bug in countless locations. Hospitals nationwide are eager to reduce C. diff infections. A few years ago, when the UCSF Medical Center set a priority to cut rates of the infection, the UCSF Health Informatics team pitched an unusual strategy: digitally reconstructing each patient's footsteps through the hospital. The team realized that each patient's electronic health record (EHR) contained detailed information about every room the patient had entered for every test. Using these digital breadcrumbs mined from the records, the team was able to trace a significant source of infection back to one CT scan machine.
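The core of the approach can be sketched in a few lines: join each patient's room-visit trail with their later infection status, then rank locations by the infection rate among their visitors. Everything below (room names, patient IDs, the visit log) is invented toy data for illustration, not UCSF's actual records or method:

```python
from collections import defaultdict

# Toy visit log: rooms/machines each patient passed through, per their EHR.
visits = {
    "pt1": ["ER", "CT-scanner-3", "ward-A"],
    "pt2": ["ER", "ward-B"],
    "pt3": ["CT-scanner-3", "ward-A"],
    "pt4": ["ER", "CT-scanner-3"],
    "pt5": ["ward-B"],
    "pt6": ["ER", "ward-A"],
}
# Patients who later tested positive for C. diff.
infected = {"pt1", "pt3", "pt4"}

exposed = defaultdict(int)    # visitors per location
positive = defaultdict(int)   # ...who later tested positive

for patient, rooms in visits.items():
    for room in set(rooms):          # count each location once per patient
        exposed[room] += 1
        if patient in infected:
            positive[room] += 1

# Infection rate among visitors of each location; the outlier is the
# candidate source worth investigating (and deep-cleaning).
rates = {room: positive[room] / exposed[room] for room in exposed}
suspect = max(rates, key=rates.get)
```

In this toy data all three visitors of `CT-scanner-3` became infected, so it tops the ranking; a real analysis would need far more visits and a statistical test before blaming any one location.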
The Center for Clinical Data Science (CCDS) is only about a year old, but it has already built important capabilities. Its goal is not basic research but improving clinical practice within the two hospitals and the healthcare system in general. According to CCDS Executive Director Dr. Mark Michalski, several key prerequisites must be met for this technology to actually affect care. The first is industry partnerships: for-profit companies dominate both the medical technology and information technology industries, so it is important for a research center to have beneficial collaborations with external firms. Early in its short history, the CCDS established a ten-year collaboration with GE Healthcare, a major producer of medical imaging equipment that is now headquartered in Boston. This strategic partnership will focus on two major areas.
"What I love," says Alyssa Siefert, Engineering Director at the Yale Center for Biomedical Innovation and Technology (CBIT), "is a democratization of problem solving." Siefert is one of the lead organizers of the Yale Healthcare Hackathon, an event now in its fifth year that brings together a diverse group of clinicians, engineers, designers, patients and community members January 19-21 at the Yale School of Medicine to come up with solutions to healthcare challenges. Last year, the event drew representatives from eight countries and two dozen universities, and those numbers have been on the rise. About half the participants come from outside Yale. The main sponsor of this year's event is 4Catalyzer, a Guilford, Connecticut-based accelerator for launching new biomedical startups, with a heavy emphasis on medical devices, artificial intelligence and big data; it was founded by Dr. Jonathan Rothberg, who serves as its Chief Strategy Officer.
Artificial intelligence will soon change how we conduct our daily lives. Are companies prepared to capture value from the oncoming wave of innovation? Yes, they have a fine MRI machine and powerful software to generate the images. But that's where the machines bog down. The radiologist has to find and read the patient's file, examine the images, and make a determination.
Artificial Intelligence: Arterys. AI has not had a bad year yet. Between breakthrough technologies and soaring funding rounds, there was no shortage of strong candidates to choose from in 2017. Ambra Health CEO Morris Panner, JD, gave the nod to Arterys. The 10-year-old San Francisco, California, company both started and ended 2017 in style. In January, it received a first-of-its-kind FDA approval for its cloud-based technology, which applies AI and deep learning to medical imaging analysis.