Nabla
Leveraging the clinician's expertise with agentic AI
How ambient AI assistants are helping clinicians save time, reduce burnout, and enhance treatment, restoring the doctor-patient experience. For many clinicians, administration is a whole job on its own. From examination findings to proposed treatments, test results, and patient education, clinicians must maintain accurate, clear, and timely clinical records every step of the way.
Artificial intelligence transforms patient care and reduces burnout, physician says
DENVER – Artificial intelligence is quietly transforming how doctors interact with patients, and it might already be in use during your next visit to the doctor's office. Thousands of physicians across the country are using a form of AI called ambient listening, surveys show. With just one click, the technology begins transcribing the doctor's conversation with a patient: it listens to the exchange, creates a real-time transcription, and then compiles detailed clinical notes, all without disrupting the flow of the appointment. Dr. Daniel Kortsch, associate chief of artificial intelligence and digital health at Denver Health, said that ambient listening technology has made a big difference since his practice began using it in fall 2024.
OpenAI's Transcription Tool Hallucinates. Hospitals Are Using It Anyway
On Saturday, an Associated Press investigation revealed that OpenAI's Whisper transcription tool creates fabricated text in medical and business settings despite warnings against such use. The AP interviewed more than 12 software engineers, developers, and researchers who found the model regularly invents text that speakers never said, a phenomenon often called a "confabulation" or "hallucination" in the AI field. Upon its release in 2022, OpenAI claimed that Whisper approached "human level robustness" in audio transcription accuracy. However, a University of Michigan researcher told the AP that Whisper created false text in 80 percent of public meeting transcripts examined. Another developer, unnamed in the AP report, claimed to have found invented content in almost all of his 26,000 test transcriptions.
- North America > United States > Michigan (0.26)
- North America > United States > Virginia (0.06)
- North America > United States > Minnesota (0.06)
- North America > United States > California > Los Angeles County > Los Angeles (0.06)
- Information Technology > Artificial Intelligence > Natural Language > Large Language Model (1.00)
- Information Technology > Artificial Intelligence > Natural Language > Chatbot (0.98)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks > Deep Learning > Generative AI (0.87)
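For context, the Whisper model discussed in the piece above is also distributed as an open-source Python package. A minimal sketch of the kind of transcription call involved might look like the following; the audio file name is a placeholder, and, as the AP investigation highlights, nothing in the output marks which passages might be fabricated.

```python
# Minimal sketch: transcribing an audio file with the open-source
# openai-whisper package (pip install openai-whisper).
# "visit_recording.wav" is a hypothetical file name used for illustration.
import whisper

model = whisper.load_model("base")                 # smallest general-purpose checkpoint
result = model.transcribe("visit_recording.wav")   # runs speech-to-text on the file

# The model returns plain text with no confidence markers, so any
# fabricated ("hallucinated") passages look identical to accurate ones.
print(result["text"])
```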
Nabla opens a health tech stack for patient engagement – TechCrunch
After setting out to examine digital healthcare from the inside by launching its own women's health clinic as an app last year, French startup Nabla is executing the next step in a planned pivot to B2B, announcing today that it has opened its machine learning tech stack to other digital health businesses and healthcare providers so they can offer what it bills as "personalized medicine". Nabla's AI-powered patient communications and engagement/retention platform is designed to help clinicians deliver a more continuous, data-driven service, whether the client is offering real-time telehealth consultations or delivering a service to patients via asynchronous, text-based messaging. Nabla's messaging and teleconsultation communication modules sit as a layer atop the customer's healthcare service, ingesting and structuring patient data, with its machine learning software supporting clinicians through real-time prompts and visualizations, as well as offering ongoing patient outreach features to extend service provision. The startup argues its approach can improve medical outcomes by helping healthcare professionals ask relevant questions during a consultation, based on the AI's ability to aggregate patient activity and surface contextually relevant data, and afterwards, with features like automated transcription and suggested updates a clinician could make to a patient's medical file. It likens the platform's capabilities to having a really attentive family doctor who knows their patient's full medical history and situation, and has a faultless memory for all that detail.
- North America > United States (0.05)
- Europe > France (0.04)
- Africa (0.04)
Mock Patient Told to Kill Themselves by OpenAI's GPT-3 Medical Chatbot -- AI Daily - Artificial Intelligence News
GPT-3, also known as Generative Pre-trained Transformer 3, is an advanced autoregressive language model that uses deep learning to produce human-like text. In a recent experiment, Nabla, a Paris-based firm specialising in healthcare technology, used a cloud-hosted instance of GPT-3 to analyse queries from humans and produce suitable answers to them. The bot was designed to ease the daily workload of doctors; however, after running a series of tests, the firm concluded that the unpredictable nature of the software's responses made it inappropriate for interacting with patients in the real world.
Medical chatbot using OpenAI's GPT-3 told a fake patient to kill themselves
We're used to medical chatbots giving dangerous advice, but one based on OpenAI's GPT-3 took it much further. If you've been living under a rock, GPT-3 is essentially a very clever text generator that's been making various headlines in recent months. Only Microsoft has permission to use it for commercial purposes after securing exclusive rights last month. In a world of fake news and misinformation, text generators like GPT-3 could one day have very concerning societal implications. Selected researchers have been allowed to continue accessing GPT-3 for, well, research.
- North America > United States > California (0.06)
- Europe > Netherlands > North Holland > Amsterdam (0.06)
- Health & Medicine (1.00)
- Media > News (0.57)
- Information Technology > Artificial Intelligence > Natural Language > Large Language Model (1.00)
- Information Technology > Artificial Intelligence > Natural Language > Chatbot (1.00)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks > Deep Learning > Generative AI (0.65)
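The articles above describe Nabla sending patient-style prompts to a hosted GPT-3 instance and reading back its free-text answers. As a rough, hedged illustration of that kind of test (not Nabla's actual harness), a query against OpenAI's original, now-legacy Completions API looked roughly like the sketch below; the prompt framing and parameter values are illustrative assumptions.

```python
# Rough sketch of a GPT-3 Completions call as it looked at the time
# (openai-python < 1.0; this endpoint is now deprecated).
# The prompt framing and parameters are illustrative, not Nabla's actual setup.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

def ask_gpt3(patient_message: str) -> str:
    """Send a patient-style message to GPT-3 and return its free-text reply."""
    prompt = (
        "The following is a conversation between a patient and a doctor.\n"
        f"Patient: {patient_message}\n"
        "Doctor:"
    )
    response = openai.Completion.create(
        engine="davinci",    # the base GPT-3 model offered at the time
        prompt=prompt,
        max_tokens=60,
        temperature=0.7,
    )
    # The model simply continues the text; nothing constrains the reply to be
    # safe or clinically sound, which is what Nabla's tests exposed.
    return response.choices[0].text.strip()
```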
Doctor GPT-3: hype or reality? - Nabla
You may have heard about GPT-3 this summer, the new cool kid on the AI block. GPT-3 came out of OpenAI, one of the top AI research labs in the world, which was founded in late 2015 by Elon Musk, Sam Altman, and others, and later backed with a $1B investment from Microsoft. You've probably also heard about the ongoing AI revolution in healthcare, thanks to promising results in areas such as automated diagnosis, medical documentation, and drug discovery, to name a few. Some have claimed that algorithms now outperform doctors on certain tasks, and others have even announced that robots will soon receive medical degrees of their own! This can all sound far-fetched... but could this robot actually be GPT-3?
The "Godfather of AI" just trashed GPT-3
GPT-3, an advanced language-processing artificial intelligence algorithm developed by OpenAI, is really good at what it does -- churning out humanlike text. But Yann LeCun, the Chief AI Scientist at Facebook who's been called a "godfather of AI," trashed the algorithm in a Tuesday Facebook post, writing that "people have completely unrealistic expectations about what large-scale language models such as GPT-3 can do." LeCun cites a recent experiment by the medical AI firm NABLA, which found that GPT-3 is woefully inadequate for use in a healthcare setting because writing coherent sentences isn't the same as being able to reason or understand what it's saying. "It's entertaining, and perhaps mildly useful as a creative help," LeCun wrote. "But trying to build intelligent machines by scaling up language models is like [using] high-altitude airplanes to go to the Moon. You might beat altitude records, but going to the Moon will require a completely different approach."