Patients aren't being told about the AI systems advising their care

#artificialintelligence

Since February of last year, tens of thousands of patients hospitalized at one of Minnesota's largest health systems have had their discharge planning decisions informed with help from an artificial intelligence model. But few if any of those patients have any idea about the AI involved in their care. That's because frontline clinicians at M Health Fairview generally don't mention the AI whirring behind the scenes in their conversations with patients.


Shielded from scrutiny, Epic algorithms deliver inaccurate information

#artificialintelligence

Several artificial intelligence algorithms developed by Epic Systems, the nation's largest electronic health record vendor, are delivering inaccurate or irrelevant information to hospitals about the care of seriously ill patients, contrasting sharply with the company's published claims, a STAT investigation found. Employees of several major health systems said they were particularly concerned about Epic's algorithm for predicting sepsis, a life-threatening complication of infection. The algorithm, they said, routinely fails to identify the condition in advance, and triggers frequent false alarms. Some hospitals reported a benefit for patients after fine-tuning the model, but that process took at least a year.


How the 'Religious Freedom Division' Threatens LGBT Health--and Science

WIRED

When Marci Bowers consults with her patients, no subject is off limits. A transgender ob/gyn and gynecologic surgeon in Burlingame, California, she knows how important it is that patients feel comfortable sharing their sexual orientation and gender identity with their doctor; trust and honesty are essential to providing the best medical care. But Bowers knows firsthand that the medical setting can be a challenging place for patients to be candid, and that for LGBT people, it can even be dangerous.


App promises to improve pain management in dementia patients

#artificialintelligence

University of Alberta computing scientists are developing an app to help health-care staff assess and manage pain in patients with dementia and other neurodegenerative diseases. "The challenge with understanding pain in patients with dementia is that the expressions of pain in these individuals are often mistaken for psychiatric problems," said Eleni Stroulia, professor in the Department of Computing Science and co-lead on the project. "So we asked, how can we use technology to better understand the pain of people with dementia?" Along with Stroulia, the project is led by Thomas Hadjistavropoulos at the University of Regina as part of AGE-WELL, one of Canada's Networks of Centres of Excellence. The app will digitize a pen-and-paper observational checklist that past research has shown helps health-care workers such as nurses assess pain in patients with dementia.


How artificial intelligence could make clinical trials smarter

#artificialintelligence

Clinical trials have a dirty little secret. For all the careful work that goes into randomizing and blinding participants just so, the criteria that determine who can enter a trial can be unexpectedly arbitrary. Patients can be nixed because of age, lab values, medication history, and a laundry list of other factors that may not always be necessary. "It was certainly surprising to us that these clinical trial criteria designs are fairly ad hoc and quite anecdotal," said James Zou, who leads Stanford's Laboratory for Machine Learning, Genomics, and Health. The unwitting result can be that women, older patients, and people of color are excluded from studies at disproportionately high rates.