Artificial intelligence (AI) has advanced significantly in the past half decade and is making major inroads across many industries and sectors worldwide. Earlier this month, Stanford University released The One Hundred Year Study on Artificial Intelligence (AI100) 2021 Study Panel Report. The new Stanford AI100 report is the second in a series following the inaugural AI100 report published five years ago in September 2016. Stanford plans to continue publishing the AI100 report once every five years for a hundred years or longer. "The field of artificial intelligence has made remarkable progress in the past five years and is having real-world impact on people, institutions and culture," the researchers wrote.
Ronen Lavi and Shay Perera have spent years working to develop and deploy AI in one of the most demanding environments in the world--the elite intelligence units of the Israel Defense Forces (IDF). Lavi established and led the AI Lab of Israel's Military Intelligence and Perera served there as manager of machine learning and computer vision research and development. After being awarded a National Security Award in 2018, they left the IDF to launch a startup, as many Israelis with similar experience and skills have done before them. The rapid digital transformation of the healthcare industry worldwide, the proliferation of healthcare data, the increasing complexity of healthcare (including its administration), the dearth of qualified personnel--and the Covid pandemic--have all contributed to a rising demand for AI solutions, intended to assist with detection, diagnosis, treatment, preventive care and wellness. The wealth of data that is produced by digitized medical records is what modern AI approaches (deep learning) require so they can "learn" from examples, automate certain decisions, and provide a helping hand to physicians and healthcare staff.
Using a raised eyebrow or smile, people with speech or physical disabilities can now operate their Android-powered smartphones hands-free, Google said Thursday. Two new tools put machine learning and front-facing cameras on smartphones to work detecting face and eye movements. Users can scan their phone screen and select a task by smiling, raising eyebrows, opening their mouth or looking to the left, right or up. "To make Android more accessible for everyone, we're launching new tools that make it easier to control your phone and communicate using facial gestures," Google said. The Centers for Disease Control and Prevention estimates that 61 million adults in the United States live with disabilities, which has pushed Google and rivals Apple and Microsoft to make products and services more accessible to them.
Researchers and data scientists at UT Southwestern Medical Center and MD Anderson Cancer Center have developed an artificial intelligence technique that can identify which neoantigens--cell surface peptides produced by cancer cells--are recognized by the immune system. The pMTnet technique, detailed online in Nature Machine Intelligence, could lead to new ways to predict cancer prognosis and potential responsiveness to immunotherapies. "Determining which neoantigens bind to T cell receptors and which don't has seemed like an impossible feat. But with machine learning, we're making progress," said senior author Tao Wang, Ph.D., Assistant Professor of Population and Data Sciences, and with the Harold C. Simmons Comprehensive Cancer Center and the Center for Genetics of Host Defense at UT Southwestern. Mutations in the genome of cancer cells cause them to display different neoantigens on their surfaces.
Signals in the brains of birds have been read by scientists, in a breakthrough that could help develop prostheses for humans who have lost the ability to speak. In the study, silicon implants recorded the firing of brain cells as male adult zebra finches went through their full repertoire of songs. Feeding the brain signals through artificial intelligence allowed the team from the University of California San Diego to predict what the birds would sing next. The breakthrough opens the door to new devices that could be used to turn the thoughts of people unable to speak into real, spoken words for the first time. Current state-of-the-art implants allow the user to generate text at a speed of about 20 words per minute, but this technique could allow for a fully natural 'new voice'.
The FDA has authorized the first artificial intelligence software to help doctors detect prostate cancer. The program, called Paige Prostate, is the first approved AI system in pathology. "We really believe this product can make a huge difference," Paige CEO Leo Grady, PhD, says. The program was approved to help doctors, not to replace them. "For a second opinion today, you ship a glass slide to somebody else or you do another stain that's really expensive or you do another molecular test," Grady says.
"Pathologists examine biopsies of tissue suspected for diseases, such as prostate cancer, every day. Identifying areas of concern on the biopsy image can help pathologists make a diagnosis that informs the appropriate treatment," said Tim Stenzel, M.D., Ph.D., director of the Office of In Vitro Diagnostics and Radiological Health in the FDA's Center for Devices and Radiological Health. "The authorization of this AI-based software can help increase the number of identified prostate biopsy samples with cancerous tissue, which can ultimately save lives." Cancer that starts in the prostate is called prostate cancer. According to the Centers for Disease Control and Prevention, aside from non-melanoma skin cancer, prostate cancer is the most common cancer among men in the United States.
For the past 10 years, Sonia Grego has been thinking about toilets – and more specifically what we deposit into them. "We are laser-focused on the analysis of stool," says the Duke University research professor, with all the unselfconsciousness of someone used to talking about bodily functions. "We think there is an incredible untapped opportunity for health data. And this information is not tapped because of the universal aversion to having anything to do with your stool." As the co-founder of Coprata, Grego is working on a toilet that uses sensors and artificial intelligence to analyse waste; she hopes to have an early model for a pilot study ready within nine months.
Many healthcare organizations are working on advancing virtual care options. That said, the path to digital healthcare transformation is bumpy. Covid-19 forced healthcare organizations to make a huge leap forward in their digital transformation roadmaps. Although many companies had plans on the books to advance telemedicine, the crisis revealed that virtual care is not only possible but in many cases is also preferred by patients. Virtual care also offers an opportunity to enhance patient experience, improve population health, reduce costs and improve the work life of healthcare providers -- the quadruple aim of healthcare. "To improve patient outcomes, we have to involve a patient's family and significant others in their care, because it must extend beyond the four walls of the hospital," says Dr. Reetu Singh, senior medical director, clinical documentation integrity at AdventHealth.
This is the first post in an intended series on the current state of Artificial Intelligence capabilities and what we can expect in the relatively short term. I will be at odds with the more outlandish claims that are circulating in the press, and among what I consider an alarmist group that includes people both in the AI field and outside of it. In this post I start to introduce some of the key components of my future arguments, as well as show how different any AI system might be from us humans. Some may recognize the title of this post as an homage to the 1974 paper by Thomas Nagel, "What Is It Like to Be a Bat?". Two more recent books, one from 2009 by Alexandra Horowitz on dogs, and one from 2016 by Peter Godfrey-Smith on octopuses, also pay homage to Nagel's paper, each with a section of a chapter titled "What it is like" and "What It's Like", respectively, giving affirmative responses to their own questions about what it is like to be a dog, or an octopus.