Updated AWS Deep Learning AMIs: New Versions of TensorFlow, Apache MXNet, Keras, and PyTorch


The AMIs also come with improved framework support for NVIDIA Volta. They include PyTorch v0.3.0, and support NVIDIA CUDA 9 and cuDNN 7, with significant performance improvements for training models on NVIDIA Volta GPUs. They also include a version of TensorFlow built from the master branch and integrated with NVIDIA's contributions for Volta support. We've also added Keras 2.0 support on the CUDA 9 version of the AWS Deep Learning AMIs, with TensorFlow as the default backend.
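Keras picks its default backend from the `backend` field of its `keras.json` config file, which is how an AMI can ship with TensorFlow preselected. A minimal sketch of that lookup, written against a temporary directory rather than a real home directory (the directory layout here is illustrative, not the AMI's actual filesystem):

```python
import json
import os
import tempfile

def read_keras_backend(keras_dir):
    """Return the backend named in keras.json, defaulting to 'tensorflow'."""
    config_path = os.path.join(keras_dir, "keras.json")
    if not os.path.exists(config_path):
        return "tensorflow"  # Keras's own default when no config exists
    with open(config_path) as f:
        return json.load(f).get("backend", "tensorflow")

with tempfile.TemporaryDirectory() as d:
    # No config file yet: fall back to the default backend.
    assert read_keras_backend(d) == "tensorflow"
    # Write a config naming a different backend and read it back.
    with open(os.path.join(d, "keras.json"), "w") as f:
        json.dump({"backend": "mxnet", "image_data_format": "channels_last"}, f)
    print(read_keras_backend(d))  # -> mxnet
```

In practice the real file lives at `~/.keras/keras.json`; editing its `backend` field is how users switch Keras between the frameworks bundled on the AMI.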

Medical Imaging Drives GPU Accelerated Deep Learning Developments


Although most recognize GE as a leading name in energy, the company has steadily built a healthcare empire over decades, beginning with its leadership in medical X-ray machines in the 1950s, then CT systems in the 1970s, and today with devices that span a broad range of uses. Much of GE Healthcare's current medical device business is rooted in imaging hardware and software systems, including CT imaging machines and other diagnostic equipment. The company has also invested significantly in the drug discovery and production arena in recent years--something the new CEO of GE, John Flannery (who previously led the healthcare division at GE), identified as one of three main focal points for GE's financial future. According to Flannery, the company's healthcare unit has one million scanners in service globally, which generate 50,000 scans every few moments. As one might imagine, this kind of volume will increasingly require more processing and analysis capabilities cooked in--something the company is seeking to get ahead of with today's partnership with Nvidia.

NVIDIA Researchers Showcase Major Advances in Deep Learning at NIPS

NVIDIA Blog

AI has become part of the public consciousness. Researchers and data scientists have been sharing their groundbreaking work -- at what is officially known as the Conference and Workshop on Neural Information Processing Systems -- for three decades. But it's only with the recent explosion of interest in deep learning that NIPS has really taken off. We had two papers accepted to the conference this year, and contributed to two others. The researchers involved are among the 120 people on the NVIDIA Research team focused on pushing the boundaries of technology in machine learning, computer vision, self-driving cars, robotics, graphics, computer architecture, programming systems, and other areas.

TITAN V: Now NVIDIA is talking deep-learning horsepower


This is a graphics card created for the PC. It made its debut Thursday, positioned as "the world's most powerful GPU for the PC," with CEO Jensen Huang handling the introduction at the annual AI gathering, the NIPS (Neural Information Processing Systems) conference. VentureBeat's Blair Frank said, "The new Titan V card will provide customers with a Nvidia Volta chip that they can plug into a desktop computer." It packs massive amounts of power to speed AI computation.

Titan V and Nvidia's bleeding-edge Volta GPU: 5 things PC gamers need to know


Seven long months after the next-generation "Volta" graphics architecture debuted in the Tesla V100 for data centers, the Nvidia Titan V finally brings the bleeding-edge tech to PCs in traditional graphics card form. But make no mistake: This golden-clad monster targets data scientists, with a tensor core-laden hardware configuration designed to optimize deep learning tasks. You won't want to buy this $3,000 GPU to play Destiny 2. But that doesn't mean we humble PC gamers can't glean information from Volta's current AI-centric incarnations. Here are five key things you need to know about the Titan V and Nvidia's Volta GPU. Editor's note: This article was originally published on May 11, 2017, but was updated on December 8 to include information about the Titan V.

NVIDIA's 'most powerful GPU' ever is built for AI


NVIDIA's newest Titan GPU is now available for purchase, and the company says it's the "world's most powerful GPU for the PC" yet. The GPU-maker has launched the Volta-powered Titan V at the annual Neural Information Processing Systems conference. Volta is NVIDIA's latest microarchitecture, designed to double the energy efficiency of its predecessor, and Titan V can apparently deliver 110 teraflops of raw horsepower, or around 9 times what the previous Titan is capable of. Since Volta was designed to handle a mix of computational workloads and has features created specifically for deep learning, scientists can use the GPU to build their own desktop PCs if they don't need special servers. "Our vision for Volta was to push the outer limits of high performance computing and AI."
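The "around 9 times" figure is easy to sanity-check: taking the article's 110 teraflops for Titan V against the roughly 12 FP32 teraflops widely published for the previous Titan (Titan Xp) -- an assumed spec, not stated in the article -- the ratio lands just above 9:

```python
# Rough sanity check of the "around 9x" claim. The 110 TFLOPS figure for
# Titan V comes from the article; the ~12 TFLOPS figure for the prior
# Titan (Titan Xp, FP32) is an assumed, widely published spec.
titan_v_tflops = 110.0   # tensor-core deep learning throughput
titan_xp_tflops = 12.0   # assumed FP32 throughput of the prior Titan

speedup = titan_v_tflops / titan_xp_tflops
print(round(speedup, 1))  # -> 9.2
```

Note the comparison mixes tensor-core deep learning throughput with ordinary FP32 throughput, which is why the headline multiple is so large.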

Nvidia Stands Out in Management Top 250 Rankings

Wall Street Journal

Nvidia Corp. has been a force among videogame fans for more than a decade. Now the rest of the world is catching on, as the premier maker of chips that paint scenes of on-screen adventure and mayhem emerges as the kingpin in hardware for artificial intelligence. Sales of Nvidia chips to internet giants like Microsoft Corp. and Facebook Inc.--which rely on AI to do things like automatic image labeling and language translation--have grown by triple digits, year over year, for six quarters straight. Investors have responded by driving up Nvidia's stock roughly sevenfold in the past two years, lately trading at about 45 times earnings, compared with an industry average around 17. The market is rewarding not only the company's dominance in graphics and AI but also its extraordinarily well-balanced operations. In the Drucker Institute's Management Top 250 ranking of the most effectively managed U.S. companies, Nvidia's overall score of 76.8, which puts it in the top 10, is based on strong scores in all five categories that contribute to the overall ranking--customer satisfaction, employee engagement and development, innovation, social responsibility and financial strength.

IBM's now serving chips for AI


Instances of artificial intelligence (AI), machine learning (ML), or deep learning are appearing across all sorts of enterprise service offerings. While there's a certain amount of bandwagon-jumping and overuse of the terms to grab headlines, machine-learning (et al.) implementations are becoming quite the norm. Combined with a rise in the numbers of massive public networks of computing power (hyperscale data centers) offering everything-as-a-service (XaaS) from the cloud, it's no surprise that the big enterprise-level server vendors are responding with AI-centric technologies. The first into the fray is IBM, which has announced a new microprocessor and a server powered by it, the Power9 and the AC922, respectively. The chip is optimized for the particular demands of AI computation: in tests, it runs workloads on common AI frameworks such as Chainer and TensorFlow at four times the speed of existing systems.

How To Keep Your Job Regardless Of AI

International Business Times

Nvidia deep learning consultant Michelle Gill never imagined herself working in California's robot-crazed tech industry. When she left Nebraska and got a PhD in biochemistry and biophysics at Yale University, she saw herself as more of a scientist who studied life than a technologist prepared to build new creations. It wasn't until she started working at the National Cancer Institute that she first became interested in machine learning. Analyzing medical images with data science opened the door to a whole new world. "A lot of the concepts I had learned in science applied in some way to machine learning," Gill told Newsweek at the Artificial Intelligence & Data Science conference in New York City.

Where AI Is Headed: 13 Artificial Intelligence Predictions for 2018

NVIDIA Blog

Publications like The Wall Street Journal, Forbes and Fortune have all called 2017 "The Year of AI." AI outperformed professional gamers and poker players in new realms. Access to deep learning education expanded through various online programs. The speech recognition accuracy record was broken multiple times, most recently by Microsoft. And research universities and organizations like Oxford, Massachusetts General Hospital and GE's Avitas Systems invested in deep learning supercomputers. These are a few of many milestones in 2017.