Improving Drug Safety With Adverse Event Detection Using NLP

#artificialintelligence

Don't miss our upcoming virtual workshop with John Snow Labs, Improve Drug Safety with NLP, to learn more about our joint NLP solution accelerator for adverse drug event detection. The World Health Organization defines pharmacovigilance as "the science and activities relating to the detection, assessment, understanding and prevention of adverse effects or any other medicine/vaccine-related problem." While all medicines and vaccines undergo rigorous testing for safety and efficacy in clinical trials, certain side effects may only emerge once these products are used by a larger and more diverse patient population, including people with other concurrent diseases. Adverse drug reactions or events (ADEs) are medical problems that occur during treatment with a drug or therapy. To support ongoing drug safety, biopharmaceutical manufacturers must report ADEs to regulatory agencies such as the Food and Drug Administration (FDA) in the United States and the European Medicines Agency (EMA) in the European Union.
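
The excerpt only names the joint solution accelerator; as a rough illustration of what ADE detection over free text involves, the toy sketch below flags sentences in which a drug mention co-occurs with an adverse-event term. The hand-written vocabularies and the co-occurrence rule are illustrative assumptions, not the accelerator's actual method, which relies on pretrained clinical NLP models rather than keyword lists.

```python
import re

# Illustrative vocabularies only -- a real pharmacovigilance pipeline would use
# pretrained clinical NER models, not hand-written term lists.
DRUG_TERMS = {"ibuprofen", "metformin", "warfarin", "lisinopril"}
ADE_TERMS = {"nausea", "rash", "dizziness", "bleeding", "headache"}

def find_ade_candidates(text: str):
    """Return (sentence, drugs, events) tuples where a drug mention and an
    adverse-event term co-occur in the same sentence."""
    candidates = []
    # Naive sentence split on ., !, ? -- sufficient for a toy example.
    for sentence in re.split(r"(?<=[.!?])\s+", text):
        tokens = {t.lower() for t in re.findall(r"[A-Za-z]+", sentence)}
        drugs = tokens & DRUG_TERMS
        events = tokens & ADE_TERMS
        if drugs and events:
            candidates.append((sentence.strip(), sorted(drugs), sorted(events)))
    return candidates

if __name__ == "__main__":
    note = ("Patient was started on metformin last month. "
            "She reports persistent nausea and mild dizziness since the warfarin dose increase, "
            "along with minor bleeding at the injection site.")
    for sentence, drugs, events in find_ade_candidates(note):
        print(f"Possible ADE: drugs={drugs}, events={events} | {sentence}")
```

In practice, the co-occurrence rule would be replaced by named-entity recognition and relation-extraction models trained on clinical text, so that negated, historical, or unrelated mentions are not reported as adverse events.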


FDA Convenes Medical Device Workshop Focused on Artificial Intelligence and Machine Learning Transparency

#artificialintelligence

On October 14, 2021, the U.S. Food and Drug Administration ("FDA" or the "Agency") held a virtual workshop entitled Transparency of Artificial Intelligence ("AI")/Machine Learning ("ML")-enabled Medical Devices. The workshop builds on previous Agency efforts in the AI/ML space. In 2019, FDA issued a discussion paper and request for feedback, Proposed Regulatory Framework for Modifications to AI/ML-Based Software as a Medical Device ("SaMD"). To support continued framework development and to increase collaboration and innovation among key stakeholders and specialists, FDA created the Digital Health Center of Excellence in 2020. In January 2021, FDA published an AI/ML Action Plan based, in part, on stakeholder feedback to the 2019 discussion paper.


Medtechs need strategy to prevent bias in AI-machine learning-based devices: FDA

#artificialintelligence

Speaking at an FDA public workshop on the topic, Jeff Shuren, director of the FDA's Center for Devices and Radiological Health, on Thursday called out the need for better methodologies to identify and improve algorithms prone to mirroring "systemic biases" in the healthcare system and in the data used to train artificial intelligence and machine learning-based devices. The medical device industry, he said, should develop a strategy to enroll racially and ethnically diverse populations in clinical trials. "It's essential that the data used to train [these] devices represent the intended patient population with regards to age, gender, sex, race and ethnicity," Shuren said. The virtual workshop comes nine months after the agency released an action plan for establishing a regulatory approach to AI/ML-based Software as a Medical Device (SaMD). Among the five actions laid out in the plan, FDA intends to foster a patient-centered approach that includes device transparency for users.