If you are looking for an answer to the question "What is Artificial Intelligence?" and you only have a minute, then here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
Artificial intelligence (AI) is one of the signature issues of our time, but also one of the most easily misinterpreted. The prominent computer scientist Andrew Ng's slogan "AI is the new electricity" signals that AI is likely to be an economic blockbuster--a general-purpose technology with the potential to reshape business and societal landscapes alike. As Ng puts it: "Just as electricity transformed almost everything 100 years ago, today I actually have a hard time thinking of an industry that I don't think AI will transform in the next several years." Such provocative statements naturally prompt the question: How will AI technologies change the role of humans in the workplaces of the future? An implicit assumption shaping many discussions of this topic might be called the "substitution" view: namely, that AI and other technologies will perform a continually expanding set of tasks better and more cheaply than humans, while humans will remain employed to perform those tasks at which machines ...
In recent years, entire industries have popped up that rely on the delicate interplay between human workers and automated software. Companies like Facebook work to keep hateful and violent content off their platforms using a combination of automated filtering and human moderators. In the medical field, researchers at MIT and elsewhere have used machine learning to help radiologists better detect different forms of cancer. What can be tricky about these hybrid approaches is understanding when to rely on the expertise of people versus programs. This isn't always merely a question of who does a task "better"; indeed, if a person has limited bandwidth, the system may have to be trained to minimize how often it asks for help.
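One simple way to picture that last point: a model routes only its least confident predictions to a human reviewer, with the confidence threshold set by how much human attention is actually available. This is a minimal sketch, not any particular company's system; the function name, the toy probabilities, and the choice of max-probability as the confidence measure are all illustrative assumptions.

```python
import numpy as np

def route_predictions(probs, human_budget):
    """Decide which predictions to send to a human reviewer.

    probs: (n, k) array of model class probabilities, one row per item.
    human_budget: maximum fraction of items a human can review.
    Returns a boolean mask: True means "defer to the human".
    """
    # Use the top class probability as a crude confidence score.
    confidence = probs.max(axis=1)
    # Set the threshold so that at most `human_budget` of the items
    # fall below it -- humans only see the least certain cases.
    threshold = np.quantile(confidence, human_budget)
    return confidence < threshold

# Toy example: 6 items, 2 classes, humans can review about a third.
probs = np.array([[0.99, 0.01], [0.55, 0.45], [0.90, 0.10],
                  [0.51, 0.49], [0.80, 0.20], [0.97, 0.03]])
mask = route_predictions(probs, human_budget=1/3)
# The two least confident items (rows 1 and 3) are deferred.
```

A real system would replace the quantile threshold with something learned from moderator workload and error costs, but the budget-driven trade-off is the same.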
Wikipedia defines artificial intelligence in healthcare as the use of complex algorithms and software to emulate human cognition in the analysis, interpretation and comprehension of complicated medical and healthcare data. This "emulation" is done in less time and at a fraction of the cost. Artificial intelligence in healthcare was valued at about $600 million in 2014 and is projected to reach $150 billion by 2026. Reinventing and reinvigorating healthcare through the use of artificial intelligence is happening predominantly through better diagnosis, better processes, drug development and robot-assisted surgery. In 2015, misdiagnosed illness and medical error accounted for 10% of all U.S. deaths.
Diagnosing emphysema and classifying its severity have long been more art than science. "Everybody has a different trigger threshold for what they would call normal and what they would call disease," said U. Joseph Schoepf, M.D., director of cardiovascular imaging for MUSC Health and assistant dean for clinical research in the Medical University of South Carolina College of Medicine. And until recently, scans of damaged lungs have been a moot point, he said. "In the past, if you lost lung tissue, that was it. The lung tissue was gone, and there was very little you could do in terms of therapy to help patients."
When a young girl came to New York University (NYU) Langone Health for a routine follow-up, tests seemed to show that the medulloblastoma for which she had been treated a few years earlier had returned. The girl's recurrent cancer was found in the same part of the brain as before, and the biopsy seemed to confirm medulloblastoma. With this diagnosis, the girl would begin a specific course of radiotherapy and chemotherapy. But just as neuropathologist Matija Snuderl was about to sign off on the diagnosis and set her on that treatment path, he hesitated. The biopsy was slightly unusual, he thought, and he remembered a previous case in which what was thought to be medulloblastoma turned out to be something else. So, to help him make up his mind, Snuderl turned to a computer.
At first, the images of lungs infected by the novel coronavirus were hard to come by. It was early in the pandemic, and Joseph Paul Cohen, a researcher at the University of Montreal, was trying to stockpile radiology scans to train an artificial intelligence model to recognize warning signs of severe illness. With so few images available, the work was next to impossible. But in recent weeks, the resurgence of Covid-19 in the U.S. and other hotspots has solved that problem, allowing him to amass hundreds of lung scans from clinical reports published around the world. "We're at a good number now," said Cohen.
AI was already having an impact in healthcare before COVID-19 came along. Now the impact of AI in healthcare is accelerating. A harbinger of the impact of AI on the spread of COVID-19 came on New Year's Eve 2019, when the AI platform BlueDot registered a cluster of unusual cases in Wuhan, China. The Toronto-based company uses natural language processing and machine learning to track, locate and report on infectious disease spread. It sends alerts to its clients, which include entities in health care, government, business and public health.
One recent study set out to develop a fully automated algorithm for spleen segmentation and to assess its performance in a large dataset. In this retrospective study, a three-dimensional deep learning network was developed to segment the spleen on thorax-abdomen CT scans. Scans were extracted from patients undergoing oncologic treatment from 2014 to 2017. A total of 1100 scans from 1100 patients were used in this study, and 400 were selected for development of the algorithm. For testing, a dataset of 50 scans was annotated to assess the segmentation accuracy and was compared against the splenic index equation. In a qualitative observer experiment, an enriched set of 100 scan-pairs was used to evaluate whether the algorithm could aid a radiologist in assessing splenic volume change. The reference standard was set by the consensus of two other independent radiologists. A Mann-Whitney U test was conducted to test whether there was a performance difference between the algorithm and the independent observer. On the test set of 50 scans, the algorithm and the independent observer obtained comparable Dice scores of 0.962 and 0.964, respectively (P = .834). The radiologist agreed with the reference standard in 81% (81 of 100) of the cases after a visual classification of volume change, which increased to 92% (92 of 100) when aided by the algorithm.
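The Dice score reported above is the standard overlap metric for segmentation: twice the intersection of the predicted and reference masks, divided by the sum of their sizes. A minimal version is easy to compute; the toy 2D masks below stand in for the study's 3D CT label volumes.

```python
import numpy as np

def dice_score(pred, ref):
    """Dice similarity coefficient between two binary masks.

    2*|A ∩ B| / (|A| + |B|); 1.0 = perfect overlap, 0.0 = none.
    """
    pred = pred.astype(bool)
    ref = ref.astype(bool)
    denom = pred.sum() + ref.sum()
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * np.logical_and(pred, ref).sum() / denom

# Toy 2D "spleen" masks: a 4x4 reference and a prediction shifted
# one column to the right, so they share 12 of 16 voxels each.
ref = np.zeros((8, 8), dtype=bool); ref[2:6, 2:6] = True
pred = np.zeros((8, 8), dtype=bool); pred[2:6, 3:7] = True
print(dice_score(pred, ref))  # 2*12 / (16+16) = 0.75
```

Scores like the study's 0.96 mean the automated contour and the radiologist's contour agree on almost every voxel.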
July 16, 2020 – UC San Diego recently announced that its health radiologists and other physicians are now leveraging artificial intelligence (AI) to augment lung-imaging analysis in a clinical research study aimed at COVID-19. The cause of death for most COVID-19 patients is pneumonia, which often requires long hospital stays in intensive care units and assistance breathing with ventilators. Last year, Albert Hsiao, MD, PhD, associate professor of radiology at University of California San Diego School of Medicine, and radiologists at UC San Diego Health developed a machine learning algorithm that allowed radiologists to use AI to enhance their own abilities to spot pneumonia on chest X-rays. The algorithm was trained with 22,000 notations by radiologists and overlays X-rays with color-coded maps that indicate pneumonia probability, researchers explained. "Pneumonia can be subtle, especially if it's not your average bacterial pneumonia, and if we could identify those patients early, before you can even detect it with a stethoscope, we might be better positioned to treat those at highest risk for severe disease and death," Hsiao said.
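The "color-coded map" idea is worth making concrete: a model produces a per-pixel probability of pneumonia, and that probability map is alpha-blended over the grayscale X-ray so hot regions stand out. The sketch below is a toy illustration of that overlay step only, not UCSD's actual pipeline; the function name, the red-channel encoding, and the random stand-in images are all assumptions.

```python
import numpy as np

def pneumonia_overlay(xray, prob_map, alpha=0.4):
    """Blend a grayscale X-ray with a color-coded probability map.

    xray: (H, W) pixel intensities in [0, 1].
    prob_map: (H, W) per-pixel model probabilities in [0, 1].
    Returns an (H, W, 3) RGB image; red intensity tracks probability.
    """
    # Replicate the grayscale image into three RGB channels.
    rgb = np.stack([xray, xray, xray], axis=-1)
    # Build a heat layer that encodes probability in the red channel.
    heat = np.zeros_like(rgb)
    heat[..., 0] = prob_map
    # Alpha-blend: radiologist still sees the anatomy underneath.
    return (1 - alpha) * rgb + alpha * heat

# Stand-in data; a real system would use a CXR and a CNN's output map.
rng = np.random.default_rng(0)
xray = rng.random((4, 4))
prob = np.clip(xray * 1.2, 0.0, 1.0)
overlay = pneumonia_overlay(xray, prob)
```

The blend keeps the underlying anatomy visible, which matters clinically: the map is a prompt for the radiologist's attention, not a replacement for their read.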
Expectations for customer service are higher today than a year ago, with the coronavirus pandemic fueling online shopping and challenging enterprise customer service operations, according to Customer Thermometer. That puts companies offering automation solutions in the right place at the right time. Directly of San Francisco, cofounded by Antony Brydon, Jean Tessier and Jeff Patterson, offers a platform to integrate into call centers and provide a mix of automation and human support. Directly recently added $11 million in funding to bring its total investor commitment to $66.8 million, according to Crunchbase. The Directly platform is trained by thousands of subject matter experts to analyze call center interactions and provide a degree of automation, according to a recent account in VentureBeat.