If you are looking for an answer to the question "What is artificial intelligence?" and you only have a minute, here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
In November, Google researchers published a paper in JAMA showing that Google's deep learning algorithm, trained on a large data set of fundus images, can detect diabetic retinopathy with better than 90 percent accuracy. Just a couple of months ago, Microsoft launched its Healthcare NExT initiative, which brings together artificial intelligence, cloud computing, research and industry partnerships. Last month, Alphabet-owned Verily launched the Project Baseline Study, a collaboration with Stanford Medicine and Duke University School of Medicine to amass a large collection of broad phenotypic health data in hopes of developing a well-defined reference of human health. "If the government did data quality and data sharing initiatives, it would be a lot different," said Andrew Maas, chief scientist at Roam Analytics, a San Francisco-based machine learning analytics platform provider focused on life sciences, at the Light Forum.
When IBM CEO Ginni Rometty delivered the opening keynote at HIMSS17, she effectively set the stage for artificial intelligence, cognitive computing and machine learning to be prevalent themes throughout the rest of the conference. Some 70 percent of survey respondents are either actively planning or researching artificial intelligence, cognitive computing and machine learning technologies, while 7 percent are rolling them out and 1 percent have already completed an implementation. It's not entirely surprising that more respondents, 30 percent, are either rolling out or have completed a rollout of population health technologies, while 50 percent are either researching or actively planning to do so. The overarching themes at the pre-conference HIMSS and Healthcare IT News Cloud Computing Forum on Sunday were, first, that security is not a core competency of hospitals and health systems, so many cloud providers can protect health data better than provider organizations can themselves; and, second, that the ability to spin up server, storage and compute resources on Amazon, Google or Microsoft is enabling a whole new era of innovation that simply is not possible when hospitals have to invest in their own infrastructure to run proofs of concept and pilot programs.
Reflecting the rapidly increasing interest and investment in cloud computing, 10,000 developers, engineers, IT executives, and Google employees and partners gathered at Next '17, Google's annual cloud event for enterprise customers. "Google has been a machine learning company for a long time; every one of its consumer-facing products has been powered by machine learning," Fei-Fei Li told me on the sidelines of the event. Fei-Fei Li and Jia Li, both leading experts in AI (specifically, in computer vision), joined Google last November to lead its newly created Cloud AI and Machine Learning group. What Google offers enterprise customers is the combination of the processing power and geographical reach of its cloud; the availability of Google's machine learning models and APIs; the sharing of massive data sets with the world; and the transfer of Google's AI skills and expertise to those customers.
New advances in satellite observation, open data and machine learning now allow us to process the massive amounts of data being produced. For example, Global Fishing Watch uses satellite-based monitoring to track fishing vessels in real time to protect fisheries around the world. Another project, ClimatePrediction.net, relies on volunteer computing to run advanced climate models that are too large to run even on supercomputers. The results are better models that can help predict the future of Earth's climate and help us understand how our oceans will cope with higher temperatures, acidification and other climate shocks.
Exploring the Artificially Intelligent Future of Finance

With technological advances increasing computing power and decreasing its cost, easing access to big data, and improving algorithms, there has been a huge surge of interest in artificial intelligence, machine learning and its subset, deep learning, in recent years. What have been the leading factors enabling recent advancements and uptake of deep learning? Yuanyuan: Customer experience could be significantly improved by using AI to analyze individual-level attributes and make traditional services much more tailor-made. Alesis: One of the main challenges for start-ups applying machine learning to financial services specifically is educating customers on the importance of data and of access to it.
As for what we would call unsupervised learning--which is to say, we're not training it on examples; it's beginning to learn on its own--that is moving more in the direction of what some consider true artificial intelligence, or even AGI: artificial general intelligence. But you can begin to understand extended cold spells, extended warm spells, droughts--and all of these things help with water management and agriculture. And I would say we'll see some of this in other areas--traffic systems, logistics systems, et cetera. In America, we have a national weather service, we have NOAA [National Oceanic and Atmospheric Administration], and we've got a private sector offering forecasts.
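The contrast with supervised training can be made concrete. Below is a minimal sketch of k-means clustering, a classic unsupervised algorithm, in plain Python with made-up toy data; the algorithm is never shown any labels, yet it discovers the two groups hidden in the points on its own.

```python
def kmeans(points, k, iters=10):
    """Minimal k-means sketch: groups unlabeled 2-D points; no labels are ever seen."""
    centers = list(points[:k])          # naive init: use the first k points as centers
    for _ in range(iters):
        # Assignment step: each point joins the cluster of its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda i: (p[0] - centers[i][0]) ** 2
                                      + (p[1] - centers[i][1]) ** 2)
            clusters[nearest].append(p)
        # Update step: move each center to the mean of its cluster.
        for i, cl in enumerate(clusters):
            if cl:
                centers[i] = (sum(p[0] for p in cl) / len(cl),
                              sum(p[1] for p in cl) / len(cl))
    return centers

# Two obvious blobs of points; which blob each point belongs to is never provided.
data = [(0.1, 0.2), (0.0, 0.0), (0.2, 0.1), (5.0, 5.1), (5.2, 4.9), (4.9, 5.0)]
centers = sorted(kmeans(data, k=2))
print(centers)  # one center near each blob: roughly (0.1, 0.1) and (5.0, 5.0)
```

Real systems cluster far higher-dimensional data and use smarter initialization, but the loop is the same: assign, update, repeat.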
FORTUNE: We hear a lot of terms on the AI front these days--"artificial intelligence," "machine learning," "deep learning," "unsupervised learning," and the one IBM uses to describe Watson: "cognitive computing." So it's a system between machine computing and humans interpreting, and we call those machine-human interactions cognitive systems. KENNY: It takes enormous, enormous amounts of computing power to do that because you've got to leave Watson running at all times, just like the human brain, and that's why I believe cloud computing has been such an important enabler here because prior to cloud computing--where you could access many machines at the same time--you were limited by a mainframe.

Bob Picciano (left) of IBM with David Kenny (right) at the IBM Insight Conference in 2015.
More recently, research has suggested that quantum effects could offer similar advantages for the emerging field of quantum machine learning (a subfield of artificial intelligence), leading to more intelligent machines that learn quickly and efficiently by interacting with their environments. As quantum technologies emerge, quantum machine learning will play an instrumental role in our society--including deepening our understanding of climate change, assisting in the development of new medicine and therapies, and also in settings relying on learning through interaction, which is vital in automated cars and smart factories. In the new study, the researchers' main result is that quantum effects can help improve reinforcement learning, which is one of the three main branches of machine learning. But while in certain situations quantum effects have the potential to offer great improvements, in other cases classical machine learning likely performs just as well or better than it would with quantum effects.
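For readers who want to see what classical reinforcement learning (the branch the quantum work aims to speed up) looks like, here is a minimal tabular Q-learning sketch. The environment is a hypothetical toy corridor invented for illustration, not anything from the study: the agent learns, purely by trial-and-error interaction with the environment, that walking right earns the reward.

```python
import random

# Toy environment: states 0..4 in a corridor; action 0 = left, action 1 = right.
# Reaching state 4 yields reward 1 and ends the episode.
N_STATES, GOAL = 5, 4

def step(state, action):
    nxt = max(0, state - 1) if action == 0 else min(GOAL, state + 1)
    return nxt, (1.0 if nxt == GOAL else 0.0), nxt == GOAL

def train(episodes=2000, alpha=0.5, gamma=0.9, eps=0.1, seed=0):
    rng = random.Random(seed)
    q = [[0.0, 0.0] for _ in range(N_STATES)]   # Q-value table: q[state][action]
    for _ in range(episodes):
        s, done, steps = 0, False, 0
        while not done and steps < 100:
            # Epsilon-greedy action choice: explore occasionally, break ties randomly.
            if rng.random() < eps or q[s][0] == q[s][1]:
                a = rng.randrange(2)
            else:
                a = 0 if q[s][0] > q[s][1] else 1
            nxt, r, done = step(s, a)
            target = r if done else r + gamma * max(q[nxt])
            q[s][a] += alpha * (target - q[s][a])   # temporal-difference update
            s, steps = nxt, steps + 1
    return q

q = train()
policy = [0 if q[s][0] > q[s][1] else 1 for s in range(N_STATES)]
print(policy)  # learned greedy action in each state
```

The quantum proposals in the study target exactly this interaction loop, aiming to cut the number of environment interactions the agent needs.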
In the movie Transcendence, Johnny Depp plays Dr Will Caster, a researcher in artificial intelligence at Berkeley trying to build a sentient computer. Deep learning is transforming how computers transcribe speech into text, recognise images, rank search results, and perform many other tasks that require intelligence. Deep learning does, however, require lots of data. It is sure to play a critical role in driving autonomous cars, ranking search results, recommending products, identifying spam email, trading stocks, and interpreting medical images.
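To see why data matters so much, recall that each of these tasks is learned as a function from labeled examples. The sketch below is a deliberately tiny stand-in: a single sigmoid unit trained by gradient descent on made-up, linearly separable data, rather than a deep network. Deep learning stacks many such units into many layers, which is precisely why it needs so many more examples.

```python
import math

# Made-up, linearly separable 2-D examples: (features, label).
data = [((-1.0, -1.0), 0), ((-2.0, -1.0), 0), ((1.0, 1.0), 1), ((2.0, 1.0), 1)]

def predict(w, b, x):
    z = w[0] * x[0] + w[1] * x[1] + b
    return 1.0 / (1.0 + math.exp(-z))           # a single sigmoid "neuron"

w, b, lr = [0.0, 0.0], 0.0, 0.1
for _ in range(1000):                            # gradient-descent steps
    gw, gb = [0.0, 0.0], 0.0
    for x, y in data:
        err = predict(w, b, x) - y               # gradient of log-loss w.r.t. z
        gw[0] += err * x[0] / len(data)
        gw[1] += err * x[1] / len(data)
        gb += err / len(data)
    w[0] -= lr * gw[0]
    w[1] -= lr * gw[1]
    b -= lr * gb

labels = [round(predict(w, b, x)) for x, _ in data]
print(labels)  # → [0, 0, 1, 1]: the unit has learned the labels from the examples
```

Four examples suffice here only because the data is trivially separable; the speech, vision and search tasks above involve millions of parameters and correspondingly vast training sets.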
Sandia National Laboratories researchers are drawing inspiration from neurons in the brain (such as the green fluorescent protein-labeled neurons of a mouse neocortex) with the aim of developing neuro-inspired computing systems to reboot computing. Sandia is exploring neural computing as a way to extend Moore's Law. Historically, neural computing has been seen as approximate and fuzzy; in their papers, however, the Sandia researchers aim to extend neural algorithms to incorporate rigor and predictability, which shows they may have a role in high-performance scientific computing.