
Collaborating Authors

Kadija Ferryman


Data can be a 'force for evil,' AI and machine learning experts say

#artificialintelligence

The COVID-19 pandemic has highlighted and exacerbated existing disparities in the healthcare system, including the consequences of bias for racialized or marginalized groups. Some of the ways racial bias emerges in the healthcare system are more obvious, such as horror stories of Black people being turned away at emergency departments. Others, experts said during the HIMSS Machine Learning and AI for Healthcare Digital Summit this week, are less visible – but can still be incredibly harmful. "There are other ways this bias manifests structurally that are not as potentially sort of obvious," said Kadija Ferryman, industry assistant professor of ethics and engineering at the NYU Tandon School of Engineering, at a panel on Tuesday. "That is through informatics and data."




Ethical Machine Learning in Health Care

Chen, Irene Y., Pierson, Emma, Rose, Sherri, Joshi, Shalmali, Ferryman, Kadija, Ghassemi, Marzyeh

arXiv.org Artificial Intelligence

The use of machine learning (ML) in health care raises numerous ethical concerns, especially as models can amplify existing health inequities. Here, we outline ethical considerations for equitable ML in the advancement of health care. Specifically, we frame ethics of ML in health care through the lens of social justice. We describe ongoing efforts and outline challenges in a proposed pipeline of ethical ML in health, ranging from problem selection to post-deployment considerations. We close by summarizing recommendations to address these challenges.


A fairer way forward for AI in health care

#artificialintelligence

When data scientists in Chicago, Illinois, set out to test whether a machine-learning algorithm could predict how long people would stay in hospital, they thought that they were doing everyone a favour. Keeping people in hospital is expensive, and if managers knew which patients were most likely to be eligible for discharge, they could move them to the top of doctors' priority lists to avoid unnecessary delays. It would be a win–win situation: the hospital would save money and people could leave as soon as possible. Starting their work at the end of 2017, the scientists trained their algorithm on patient data from the University of Chicago academic hospital system. Taking data from the previous three years, they crunched the numbers to see what combination of factors best predicted length of stay.


What if AI in health care is the next asbestos? - STAT

#artificialintelligence

Artificial intelligence is often hailed as a great catalyst of medical innovation, a way to find cures to diseases that have confounded doctors and make health care more efficient, personalized, and accessible. But what if it turns out to be poison? Jonathan Zittrain, a Harvard Law School professor, posed that question during a conference in Boston Tuesday that examined the use of AI to accelerate the delivery of precision medicine to the masses. "I think of machine learning kind of as asbestos," he said. "It turns out that it's all over the place, even though at no point did you explicitly install it, and it has possibly some latent bad effects that you might regret later, after it's already too hard to get it all out."