

Exploring the Use of Social Robots to Prepare Children for Radiological Procedures: A Focus Group Study

Nigro, Massimiliano, Righini, Andrea, Spitale, Micol

arXiv.org Artificial Intelligence

When children are anxious or scared, it can be hard for them to stay still or follow instructions during medical procedures, making the process more challenging and affecting procedure results. This is particularly true for radiological procedures, where long scan times, confined spaces, and loud noises can cause children to move, significantly impacting scan quality. To this end, children are sometimes sedated, but doctors are constantly seeking alternative non-pharmacological solutions. This work explores how social robots could assist in preparing children for radiological procedures. We conducted a focus group discussion with five hospital stakeholders, namely radiographers, paediatricians, and clinical engineers, to explore (i) the context of children's preparation for radiological procedures, i.e. their needs and how children are currently prepared, and (ii) the potential role of social robots in this process. The discussion was transcribed and analysed using thematic analysis. Among our findings, we identified three potential roles for a social robot in this preparation process: offering infotainment in the waiting room, acting as a guide within the hospital, and assisting radiographers in preparing children for the procedure. We hope that insights from this study will inform the design of social robots for pediatric healthcare.


NFRs in Medical Imaging

Vallentin, Amanda

arXiv.org Artificial Intelligence

Diagnostic imaging departments are under great pressure due to a growing workload: the number of required scans is growing and there is a shortage of qualified labor. AI solutions for medical imaging applications have shown great potential. However, very few diagnostic imaging models have been approved for hospital use, and even fewer are being implemented at hospitals. The most common reason software projects fail is poor requirements engineering; non-functional requirements (NFRs) in particular can be detrimental to a project. Research shows that machine learning professionals struggle to work with NFRs and that there is a need to adapt NFR frameworks to machine learning and AI-based software. This study uses qualitative methods to interact with key stakeholders and identify which types of NFRs are important for medical imaging applications. The study was conducted at a single Danish hospital and found that NFRs of type Efficiency, Accuracy, Interoperability, Reliability, Usability, Adaptability, and Fairness were important to the stakeholders. Efficiency stood out in particular, since the diagnostic imaging department is trying to spend as little time as possible on each scan.


Challenges for Responsible AI Design and Workflow Integration in Healthcare: A Case Study of Automatic Feeding Tube Qualification in Radiology

Thieme, Anja, Rajamohan, Abhijith, Cooper, Benjamin, Groombridge, Heather, Simister, Robert, Wong, Barney, Woznitza, Nicholas, Pinnock, Mark Ames, Wetscherek, Maria Teodora, Morrison, Cecily, Richardson, Hannah, Pérez-García, Fernando, Hyland, Stephanie L., Bannur, Shruthi, Castro, Daniel C., Bouzid, Kenza, Schwaighofer, Anton, Ranjit, Mercy, Sharma, Harshita, Lungren, Matthew P., Oktay, Ozan, Alvarez-Valle, Javier, Nori, Aditya, Harris, Stephen, Jacob, Joseph

arXiv.org Artificial Intelligence

Nasogastric tubes (NGTs) are feeding tubes that are inserted through the nose into the stomach to deliver nutrition or medication. If not placed correctly, they can cause serious harm, even death, to patients. Recent AI developments demonstrate the feasibility of robustly detecting NGT placement from chest X-ray images to reduce the risk that sub-optimally or critically placed NGTs are missed or detected late, but gaps remain in clinical practice integration. In this study, we present a human-centered approach to the problem and describe insights derived from contextual inquiry and in-depth interviews with 15 clinical stakeholders. The interviews helped us understand challenges in existing workflows and how best to align technical capabilities with user needs and expectations. We discovered the trade-offs and complexities that need consideration when choosing suitable workflow stages, target users, and design configurations for different AI proposals. We explored how to balance AI benefits and risks for healthcare staff and patients within broader organizational and medical-legal constraints. We also identified data issues related to edge cases and data biases that affect model training and evaluation; how data documentation practices influence data preparation and labelling; and how to measure relevant AI outcomes reliably in future evaluations. We discuss how our work informs the design and development of AI applications that are clinically useful, ethical, and acceptable in real-world healthcare services.


Introduction to Artificial Intelligence for Radiographers

#artificialintelligence

AI in healthcare and medical imaging has developed rapidly over the last decade. This course presents the basic elements of Artificial Intelligence (AI) in the context of Radiography. It will offer you background knowledge on all key contemporary AI topics and how these can affect your professional practice and workflow. This is one of the first AI courses designed specifically for the Radiography workforce and is integral to understanding and managing future changes in practice, as it covers all modalities of Radiography. This course is for recent radiography graduates, clinical practitioners, radiology managers, radiography researchers and educators who wish to further their understanding of the basic principles and applications of AI in Radiography and Medical Imaging. This is the first course of its kind for radiographers in the UK and Europe, and its key takeaway is the ability to understand and manage future changes in radiography practice.


Detecting the pulmonary trunk in CT scout views using deep learning

#artificialintelligence

For CT pulmonary angiograms, a scout view obtained in anterior–posterior projection is usually used for planning. For bolus tracking, the radiographer manually locates a position in the CT scout view where the pulmonary trunk will be visible in an axial CT pre-scan. We automate the task of localizing the pulmonary trunk in CT scout views using deep learning methods. In 620 eligible CT scout views of 563 patients acquired between March 2003 and February 2020, the region of the pulmonary trunk, as well as an optimal slice ("reference standard") for bolus tracking in which the pulmonary trunk was clearly visible, was annotated and used to train a U-Net to predict the region of the pulmonary trunk in the CT scout view. The network's performance was subsequently evaluated on 239 CT scout views from 213 patients and compared with the annotations of three radiographers. On the validation cohort, the network localized a slice within the region of the pulmonary trunk with an accuracy of 97.5%. On average, the selected position was 5.3 mm from the reference standard. Using a non-inferiority test (one-sided, paired Wilcoxon rank-sum test), the network performed as well as each radiographer (P < 0.001 in all cases). Automated localization of the region of the pulmonary trunk in CT scout views is therefore possible with high accuracy and is non-inferior to three radiographers.
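The abstract does not specify how a single bolus-tracking slice is chosen from the network's predicted region. As a minimal illustration only (not the authors' method), one plausible post-processing step is to take the axial row of the scout view where the predicted pulmonary-trunk probability is strongest, then report its distance to the reference slice; the function names and the probability-mask input are hypothetical.

```python
import numpy as np

def select_tracking_slice(prob_mask: np.ndarray) -> int:
    """Pick the axial position (row of the scout view) where the
    predicted pulmonary-trunk probability is strongest.

    prob_mask: 2-D array of per-pixel probabilities, e.g. the output
    of a segmentation network such as a U-Net (rows = axial positions).
    """
    row_scores = prob_mask.sum(axis=1)   # total evidence per axial row
    return int(np.argmax(row_scores))    # row with the most evidence

def distance_to_reference(pred_row: int, ref_row: int,
                          mm_per_row: float) -> float:
    """Distance in mm between the predicted and reference slice,
    given the scout view's axial spacing in mm per row."""
    return abs(pred_row - ref_row) * mm_per_row

# Toy example: a 10-row scout view whose predicted mask is
# concentrated around row 6.
mask = np.zeros((10, 5))
mask[6, :] = 0.9
mask[5, 2] = 0.4
pred = select_tracking_slice(mask)       # -> 6
dist = distance_to_reference(pred, 5, mm_per_row=1.0)  # -> 1.0 mm
```

A summed-probability argmax is only one of several reasonable reductions; a centroid of the predicted region would be another, and either would feed directly into the mm-distance comparison against the reference standard described above.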


MEDRADRESEARCH

#artificialintelligence

Our April blog is from Dr. Tracy O'Regan. Tracy is a diagnostic radiographer who works at the Society & College of Radiographers (SCoR) as officer for clinical imaging and research. She sits on the steering committee for the UK Research & Innovation (UKRI) Science & Technology Facilities Council (STFC) Cancer Diagnosis Network, she is a member of the NHSx AI Imaging Advisory Board, and she provides officer support for the SCoR AI & Emerging Technologies Working Party, which is currently consulting on a guidance document with recommendations and priorities for AI for UK professionals. In 2011, Nilsson wrote a book exploring 50 years of the development of Artificial Intelligence (AI) (1). Nilsson described AI winters and a series of false dawns; each progressed the path to our current stage of AI, with the promise of machine learning, neural networks and deep learning. Despite that development, in the main, clinical imaging and radiotherapy professionals are still discussing AI as if it were a new fashion, or perhaps even the emperor's new clothes.


Agfa launches its SmartXR Assistant

#artificialintelligence

Agfa announced the launch of its SmartXR portfolio at RSNA, which is being held virtually. SmartXR uses a unique combination of hardware and AI-powered software to lighten radiographers' workloads and provide image acquisition support. This newest member of Agfa's DR portfolio offers key assistance during the radiology routine, which has proven to be very important during the COVID-19 crisis, as well as beyond. The SmartXR portfolio brings intelligence to digital radiography (DR) equipment at the point of care. Integrated sensors and cameras, combined with powerful AI software, 3D machine vision, deep learning and machine intelligence, support the radiographer with first-time-right image acquisition.


The role of a therapy radiographer in the age of Artificial Intelligence (AI) – RadPro 365 Live

#artificialintelligence

Will the critical shortages of therapy radiographers mean that we are about to be replaced by AI, robots and machine learning systems, and that this will essentially solve the training, retention and employment problems in our profession by stealth? The Society of Radiographers have just announced that the new apprentice programs are now "GO": a more vocational training environment, in combination with a prospective employer and a degree course, will allow employers to "attract and select individuals they believe have the potential to become radiographers". It has also been announced this month that the University of Portsmouth is to close its degree course in radiotherapy and oncology in 2020, for which the timing is particularly ironic; it may well impact recruitment in the South, further exacerbating the current problem. I looked at these issues in my January blog and reported on some items in the media relating to this. The College of Radiographers published some of their latest feedback and information on Radiographer Apprenticeships in my February blog, and now, having read some of the latest books on the impact of Artificial Intelligence on us and especially the workplace, I thought it would be interesting this month to see how this might impact our profession.


Tech Giant Offers Wales Preview of New AI X-ray Unit - Business News Wales

#artificialintelligence

Global tech giant Fujifilm has given Welsh healthcare professionals an exclusive first look at its latest medical innovation ahead of the product's global clinical launch. The Japanese multinational photography and imaging company, which is a pioneer in medical imaging and diagnostics equipment, previewed its new artificial intelligence (AI) software, which is integrated into a mobile radiography system, at an event hosted by Life Sciences Hub Wales. Flown in from Tokyo for the event, the FDR nano is a mobile X-ray unit that uses integrated AI technology to quickly identify and flag abnormalities that need further investigation. The product is the first Fujifilm AI-enabled mobile unit in Europe and is due to commence clinical trials in a UK hospital. The AI in the unit highlights suspicious areas on an image to the radiographer taking the X-ray using a heat map.


New DeepMind AI 'spots breast cancer better than clinicians'

#artificialintelligence

A newly developed artificial intelligence (AI) model is able to spot breast cancer better than a clinician, new research has suggested. Google DeepMind, in partnership with the Cancer Research UK Imperial Centre, Northwestern University and Royal Surrey County Hospital, has developed the model, which can spot cancer in breast screening mammograms, in a bid to improve health outcomes and ease pressure on overstretched radiology services. Initial findings, published by the technology giant in the journal Nature, suggest the AI can identify the disease with greater accuracy, fewer false positives and fewer false negatives. The model, trained on de-identified data of 76,000 women in the UK and more than 15,000 women in the US, reportedly lowered false positive results by 1.2% and false negatives by 2.7% in the UK, but has yet to be tested in clinical studies. When tested, the AI system processed only the latest available mammogram of a patient, whereas clinicians had access to patient histories and prior mammograms to make an informed screening decision.