
UTHealth


De-identification of clinical free text using natural language processing: A systematic review of current approaches

Kovačević, Aleksandar, Bašaragin, Bojana, Milošević, Nikola, Nenadić, Goran

arXiv.org Artificial Intelligence

Background: Electronic health records (EHRs) are a valuable resource for data-driven medical research. However, the presence of protected health information (PHI) makes EHRs unsuitable for sharing for research purposes. De-identification, i.e., the process of removing PHI, is a critical step in making EHR data accessible. Natural language processing has repeatedly demonstrated its feasibility in automating the de-identification process. Objectives: Our study aims to provide systematic evidence on how the de-identification of clinical free text has evolved over the last thirteen years, and to report on the performance and limitations of current state-of-the-art systems. In addition, we aim to identify challenges and potential research opportunities in this field. Methods: A systematic search of PubMed, Web of Science, and DBLP was conducted for studies published between January 2010 and February 2023. Titles and abstracts were examined to identify relevant studies. Selected studies were then analysed in depth, and information was collected on de-identification methodologies, data sources, and measured performance. Results: A total of 2125 publications were identified for title and abstract screening, of which 69 studies were found to be relevant. Machine learning (37 studies) and hybrid (26 studies) approaches are predominant, while six studies relied only on rules. The majority of the approaches were trained and evaluated on public corpora. The 2014 i2b2/UTHealth corpus is the most frequently used (36 studies), followed by the 2006 i2b2 (18 studies) and 2016 CEGS N-GRID (10 studies) corpora.
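To make the task concrete, here is a minimal, hypothetical sketch of the rule-based style of de-identification surveyed above: regex patterns map PHI categories (dates, phone numbers, medical record numbers) to placeholder tags. This is an illustrative toy, not any specific system from the review; the pattern set and category names are assumptions, and real systems combine far richer rules with machine-learned models.

```python
import re

# Hypothetical, minimal PHI patterns for illustration only.
# Real de-identification systems cover many more categories
# (names, addresses, institutions, ages over 89, etc.).
PHI_PATTERNS = {
    "DATE": re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),
    "PHONE": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
    "MRN": re.compile(r"\bMRN[:\s]*\d+\b", re.IGNORECASE),
}

def deidentify(text: str) -> str:
    """Replace each matched PHI span with a [CATEGORY] placeholder."""
    for category, pattern in PHI_PATTERNS.items():
        text = pattern.sub(f"[{category}]", text)
    return text

note = "Pt seen 03/14/2022, MRN: 483920, call 713-555-0142 to follow up."
print(deidentify(note))
# -> Pt seen [DATE], [MRN], call [PHONE] to follow up.
```

Rule-based passes like this are precise for well-formatted identifiers but miss free-text names and unusual formats, which is why the review finds machine learning and hybrid approaches predominate.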


Postdoctoral Fellow in Bioinformatics, Deep Learning

#artificialintelligence

The successful candidate is expected to join an established bioinformatics team. The ongoing projects in BSML focus on precision medicine, the functional roles of genetic variants in complex disease, next-generation sequencing and single-cell RNA sequencing method development and data analysis, deep learning, and regulatory networks. Integrative genomics and deep learning approaches are often applied. Funding (NIH grants, CPRIT, and lab/center startup) is available to support this position for 3 years, and promotion to a faculty position is possible. The candidate will have the opportunity to access many high-throughput datasets and interact with investigators across UTHealth and the Texas Medical Center.


UTHealth to lead AI research project focused on stroke, diabetes

#artificialintelligence

We are developing novel statistical methods and machine learning approaches to interrogate EHR databases, using real-world evidence to identify the best treatment strategies and risk factors for a variety of diseases.


Machine Learning Algorithm Helps Doctors Make Decisions in Stroke Management - Docwire News

#artificialintelligence

Researchers from the University of Texas Health Science Center at Houston (UTHealth) have recently created a machine learning algorithm that can help physicians decide how to treat a patient's stroke. The artificial intelligence (AI) driven technology is designed to assist doctors outside of major stroke treatment facilities in determining whether an ischemic stroke patient would benefit from an endovascular procedure that removes the blood clot. Their work was published online on September 24 in the journal Stroke. This procedure, endovascular thrombectomy, is performed to remove an arterial blood clot causing an ischemic stroke. It involves inserting a catheter into the femoral artery of the thigh and threading it up to the blocked artery in the patient's brain, where the clot is removed.