"Many researchers … speculate that the information-processing abilities of biological neural systems must follow from highly parallel processes operating on representations that are distributed over many neurons. [Artificial neural networks] capture this kind of highly parallel computation based on distributed representations"
– from Machine Learning (Section 4.1.1; page 82) by Tom M. Mitchell, McGraw Hill Companies, Inc. (1997).
Three institutions working together have applied DeepMind's neural-network learning system to the task of discovering and diagnosing eye diseases. Moorfields Eye Hospital has been working with Google's DeepMind Health subsidiary and University College London on the effort, and the group has documented its progress in a paper published in Nature Medicine. As the researchers note, eye doctors currently use a machine that performs optical coherence tomography (OCT) on patients to determine whether they have an eye disease. While the technique is accurate and widely used, it requires highly trained doctors to spend time interpreting the results. The researchers suggest this creates a backlog that sometimes prevents patients from getting the care they need in time to save their vision.
Only 7 percent of patients live five years after a diagnosis of pancreatic cancer, the lowest rate for any cancer, according to the American Cancer Society. Elliot K. Fishman, MD, a researcher and radiologist at Johns Hopkins, is at the forefront of trying to change this statistic, and he's using artificial intelligence to do it. Fishman aims to spot pancreatic cancers far sooner than humans alone can by applying GPU-accelerated deep learning to the task. Johns Hopkins is well suited to developing a deep learning system because it has the massive amounts of pancreatic cancer data needed to teach a computer to detect the disease in a CT scan. Hospital researchers also have access to an NVIDIA DGX-1 AI supercomputer.
Artificial intelligence (AI) is proving to be one of the most significant technological advancements across industries in recent decades. While we're still years away from the robotics side of AI, the machine learning (ML) sector has exploded, helping companies with everything from improving customer retention rates to deriving enhanced insights from big data and even mitigating supply chain risks. With the global machine learning market anticipated to grow from $1.4B in 2017 to $8.8B by 2022, according to a recent report by Research and Markets, here's a look at where those investments are headed and what it means for increasingly in-demand machine learning talent. Last year, Amazon introduced us all to Alexa in the workplace, but this voice-activated, AI-powered device is only the beginning. Natural language processing (NLP), made possible through machine learning, helps computers, systems, and solutions better understand the context and meaning of sentences.
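The article doesn't say which techniques underpin these NLP systems; as a minimal, hedged sketch of the general idea that machine learning lets a computer infer meaning from word statistics rather than hand-written rules, here is a tiny Naive Bayes text classifier over an invented four-sentence corpus (all data and labels below are illustrative, not from the article):

```python
from collections import Counter
import math

# Tiny invented training corpus (illustrative only).
TRAIN = [
    ("the support team resolved my issue quickly", "positive"),
    ("great insights from our data this quarter", "positive"),
    ("the shipment was delayed again", "negative"),
    ("customer churn keeps rising", "negative"),
]

def train_naive_bayes(examples):
    """Count word frequencies per class; smoothing is applied at predict time."""
    counts = {}
    for text, label in examples:
        counts.setdefault(label, Counter()).update(text.split())
    return counts

def predict(counts, text):
    """Pick the class whose word distribution best explains the text,
    using add-one (Laplace) smoothing for unseen words."""
    vocab = {w for c in counts.values() for w in c}
    best_label, best_score = None, float("-inf")
    for label, c in counts.items():
        total = sum(c.values())
        score = sum(
            math.log((c[w] + 1) / (total + len(vocab)))
            for w in text.split()
        )
        if score > best_score:
            best_label, best_score = label, score
    return best_label

model = train_naive_bayes(TRAIN)
print(predict(model, "my issue was resolved quickly"))  # leans positive
```

Real NLP systems of the kind the article alludes to use far richer models, but the core shift is the same: the mapping from words to meaning is learned from data rather than programmed by hand.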
Intel has an ambition to bring more artificial intelligence technology into all aspects of its business, and today it is taking another step in that direction with an acquisition. The computer processing giant has acquired Vertex.AI, a startup whose mission was to make it possible to develop "deep learning for every platform", and which had built a deep learning engine called PlaidML to do this. Terms of the deal have not been disclosed, but Intel has provided us with the following statement, confirming the deal and that the whole team -- including founders Choong Ng and Brian Retford -- will be joining Intel. "Intel has acquired Vertex.AI, a Seattle-based startup focused on deep learning compilation tools and associated technology. The seven-person Vertex.AI team joined the Movidius team in Intel's Artificial Intelligence Products Group."
Machine learning, a form of artificial intelligence, enjoys unprecedented success in commercial applications. However, the use of machine learning in high-performance computing for science has been limited. Why? Advanced machine learning tools weren't designed for big data sets, like those used to study stars and planets. A team from Intel, the National Energy Research Scientific Computing Center (NERSC), and Stanford changed that: running its data set on the Cori supercomputer, the team achieved a peak rate of between 11.73 and 15.07 petaflops (single precision).
Deep learning resembles the human brain far more closely than traditional machine learning does. Consider the way your brain interprets faces. Your conscious self recognizes a whole face as a distinct person by interpreting the relationships between its parts at an astounding pace. You can't label each relationship your brain has identified, or even quantify and write out the variables it is interpreting. These things happen without your knowledge, so to speak.
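This idea, that intermediate layers detect relationships between parts which a final layer then combines, can be sketched with a deliberately tiny example. The network below is hand-wired (not learned, and not from any source above) to compute XOR: each hidden unit fires on a relationship between the inputs (roughly OR and NAND), and the output combines those detections, analogous to how a deep network builds up face features no one explicitly labeled:

```python
import numpy as np

def step(x):
    """Hard threshold activation: 1 if the input is positive, else 0."""
    return (x > 0).astype(float)

# Illustrative hand-chosen weights, not learned parameters.
W1 = np.array([[1.0, 1.0],     # hidden unit 0: fires when x0 OR x1
               [-1.0, -1.0]])  # hidden unit 1: fires when NOT (x0 AND x1)
b1 = np.array([-0.5, 1.5])
W2 = np.array([1.0, 1.0])      # output: fires when BOTH relationships hold
b2 = -1.5

def forward(x):
    h = step(W1 @ x + b1)      # intermediate representation of the input
    return step(W2 @ h + b2)   # combine the detected relationships

for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, "->", int(forward(np.array(x, dtype=float))))
```

In a real deep network these weights are learned from data, and the hidden units end up encoding relationships (edges, contours, eyes, noses) that no engineer wrote down, which is the point of the analogy above.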
Artificial intelligence can diagnose eye disease as accurately as some leading experts, research suggests. A study by Moorfields Eye Hospital in London and the Google-owned company DeepMind found that a machine could learn to read complex eye scans and detect more than 50 eye conditions. Doctors hope artificial intelligence could soon play a major role in helping to identify patients who need urgent treatment. They hope it will also reduce delays. A team at DeepMind, based in London, created an algorithm, or mathematical set of rules, to enable a computer to analyse optical coherence tomography (OCT), a high-resolution 3D scan of the back of the eye.
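The article does not describe the model's architecture, but the task it names, mapping a scan to one of 50+ conditions and a referral decision, ends in something like a softmax layer plus a triage rule. Here is a minimal, hedged sketch of that final stage; the condition names, the urgency set, and the input scores are all hypothetical placeholders:

```python
import numpy as np

# Hypothetical subset of conditions and urgency rules (illustrative only;
# the real system covers 50+ conditions).
CONDITIONS = ["normal", "wet AMD", "dry AMD", "diabetic retinopathy"]
URGENT = {"wet AMD"}

def softmax(z):
    """Turn raw scores into a probability distribution."""
    z = z - z.max()            # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def triage(logits):
    """Map scan-derived scores to a diagnosis and a referral decision."""
    probs = softmax(np.asarray(logits, dtype=float))
    diagnosis = CONDITIONS[int(probs.argmax())]
    urgency = "urgent referral" if diagnosis in URGENT else "routine"
    return diagnosis, urgency

# Fabricated scores for one scan, as if produced by an upstream network.
print(triage([0.1, 2.3, 0.4, -1.0]))
```

The hard part of the real system is of course the upstream network that turns a 3D OCT volume into those scores; the sketch only shows how probabilities over conditions can drive a referral decision.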
In a policy recommendation passed this year by the American Medical Association, the organization lauds the potential of artificial intelligence in healthcare. Combining AI with human clinicians can advance care delivery "in a way that outperforms what either can do alone," the AMA says. While such technology has been considerably hyped in recent years, the organization understands that with tempered expectations -- and deployed in the right situations -- AI can have a real impact on the industry. Many believe one of AI's biggest impact areas will be radiology.
Supercomputer manufacturer Cray has introduced a set of four artificial intelligence (AI) products to accelerate the adoption of deep learning in science and enterprise. The new products include the Cray Accel AI Lab, which aims to advance the development of deep learning technologies and workflows, and the Cray Accel AI Offerings, featuring NVIDIA Tesla V100 GPU accelerators. The new Cray Urika-XC software suite, which brings graph analytics, deep learning, and big data analytics tools to Cray XC supercomputers, will now include the TensorFlow computational framework and enhancements to the Cray software environment designed specifically to accelerate machine learning frameworks. Cray also announced a collaboration agreement with Intel: the partnership will deliver a productised software stack for deep learning at scale on Cray systems and leverage Intel's AI technologies to advance the state of the art in distributed deep learning and machine learning.