Thanks to advances in artificial intelligence, computers can now assist doctors in diagnosing disease and help monitor patient vital signs from hundreds of miles away. Now, CU Boulder researchers are working to apply machine learning to psychiatry, with a speech-based mobile app that can categorize a patient's mental health status as well as or better than a human can. "We are not in any way trying to replace clinicians," says Peter Foltz, a research professor at the Institute of Cognitive Science and co-author of a new paper in Schizophrenia Bulletin that lays out the promise and potential pitfalls of AI in psychiatry. "But we do believe we can create tools that will allow them to better monitor their patients." Nearly one in five U.S. adults lives with a mental illness, many in remote areas where access to psychiatrists or psychologists is scarce.
With the help of computational pathology firm Paige, healthcare technology giant Royal Philips is bringing clinical artificial intelligence to pathology laboratories to help improve pathologists' workflows and treatment planning for patients. According to a joint news release Thursday, this strategic collaboration will start with Paige Prostate, to help pathologists quantify and characterize cancer in tissue samples and make precise, efficient diagnoses. The release noted the need for more advanced cancer diagnosis technology as the number of cancer cases rises. Glass slide-based laboratory workflows are being converted to digital using solutions like those offered by Philips. Once digital images are created, the CE-marked Paige Prostate software is applied automatically to detect and localize prostate cancer, providing pathologists with valuable information they can use to evaluate prostate biopsies.
In my work as a journalist I am lucky enough to meet some brilliant people and learn about exciting advances in technology - along with a few duds. But every now and then I come across something that resonates in a deeply personal way. So it was in October 2018, when I visited a company called Medopad, based high up in London's Millbank Tower. This medical technology firm was working with the Chinese tech giant Tencent on a project to use artificial intelligence to diagnose Parkinson's Disease. This degenerative condition affects something like 10 million people worldwide.
The last half decade has ushered in the era of humans interacting with technology through speech, with Amazon's Alexa, Apple's Siri, and Google's AI rapidly becoming ubiquitous elements of the human experience. But while the migration from typing to voice has brought great convenience for some (and improved safety, in the case of people using technology while driving), it has not delivered on its potential for those who might otherwise stand to benefit the most: people with disabilities. For people with Down syndrome, for example, voice-based control of technology offers the promise of increased independence, and even of some new, potentially life-saving products. Yet for this particular group, today's voice-recognizing AIs pose serious problems, as a result of a combination of three factors. To address this issue, and as a step toward ensuring that people whose health conditions cause AIs to misunderstand them can still use modern technology, Google is partnering with the Canadian Down Syndrome Society. Through an effort called Project Understood, Google hopes to obtain recordings of people with Down syndrome reading simple phrases, and to use those recordings to help train its AI to understand the speech patterns common to people with Down syndrome. The effort is an extension of Google's own Project Euphonia, which seeks to improve computers' ability to understand diverse speech patterns, including impaired speech, and which earlier this year began training AIs to recognize communication from people with the neurodegenerative condition ALS, commonly known as Lou Gehrig's disease.
Colorectal cancer (CRC) is one of the leading forms of cancer and is responsible for rising mortality among young people. The aim of this paper is to provide an experimental modification of the Xception deep learning model with the Swish activation function, and to assess the feasibility of a preliminary colorectal polyp screening system by training the proposed model on a colorectal topogram dataset in two-class and three-class settings. The results indicate that the proposed model enhances the classification performance of the original convolutional neural network, achieving accuracy of up to 98.99% for classifying into two classes and 91.48% for three classes. When tested on external images, the proposed method also improves prediction over the traditional method, with 99.63% accuracy for correct prediction of two classes and 80.95% for three classes.
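The core of the modification is the Swish activation itself, which replaces the usual ReLU inside the network. A minimal sketch of the function (the `beta` parameter and function name are illustrative; the paper's exact architecture and training setup are not reproduced here):

```python
import math

def swish(x: float, beta: float = 1.0) -> float:
    """Swish activation: swish(x) = x * sigmoid(beta * x).

    Unlike ReLU, Swish is smooth and non-monotonic, which is the
    property the proposed Xception modification relies on.
    """
    return x * (1.0 / (1.0 + math.exp(-beta * x)))

# Swish behaves like the identity for large positive inputs and
# approaches zero for large negative inputs, while staying smooth
# around zero (swish(0) is exactly 0).
values = [swish(v) for v in (-10.0, 0.0, 1.0, 10.0)]
```

In a deep learning framework, the change amounts to passing this function (or the framework's built-in equivalent) as the activation of the convolutional and dense layers instead of ReLU.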
Today at AWS re:Invent in Las Vegas, NFL commissioner Roger Goodell joined AWS CEO Andy Jassy on stage to announce a new partnership to use machine learning to help reduce head injuries in professional football. "We're excited to announce a new strategic partnership together, which is going to combine cloud computing, machine learning and data science to work on transforming player health and safety," Jassy said today. NFL football is a fast and violent sport involving large men. Injuries are a part of the game, but the NFL is hoping to reduce head injuries in particular, a huge problem for the sport. A 2017 study found that 110 out of 111 deceased NFL players had chronic traumatic encephalopathy (CTE).
A new study suggests that the ability to convey kindness through facial expressions may have been a key factor in human evolution. The study was conducted by Matteo Zanella and a team of researchers at the University of Milan, and published this week in Science Advances. The team compared genetic data from human stem cells with samples from the remains of two Neanderthals and one Denisovan, a sister species to Neanderthals found in central Asia. They specifically focused on the BAZ1B gene, which has been connected to Williams-Beuren syndrome, a condition that causes people to develop wide mouths and small noses that give a generally kind and welcoming impression. The BAZ1B gene has also been associated with the evolution of two extra muscles in dogs that allow them to widen and narrow their eyes in expressive ways, something wolves aren't able to do.
Using artificial intelligence, FITIV PULSE can intelligently predict a user's rate of weight loss and provide curated activity and nutrition advice to help them reach their goals. The new feature, called FITIV Insights, makes it easier than ever to interpret health and fitness data by displaying data trends and providing expert advice, helping users create actionable fitness goals and receive objective measures of their progress. Founder Sylvio LeBlanc's early life was fraught with years of gaining and losing the same 20 pounds, over and over, without consistent and sustainable progress. "I developed FITIV for myself, primarily. I'm the kind of person that needs to know that what I'm doing is working. Seeing those numbers really kept me motivated and tracking my calories was the key to my success."