Results


Nvidia announces $2,999 Titan V, 'the most powerful PC GPU ever created'

@machinelearnbot

It seems like Nvidia announces the fastest GPU in history multiple times a year, and that's exactly what's happened again today; the Titan V is "the most powerful PC GPU ever created," in Nvidia's words. It represents a more significant leap than most products that have made that claim, however, as it's the first consumer-grade GPU based on Nvidia's new Volta architecture. That said, a liberal definition of the word "consumer" is in order here -- the Titan V sells for $2,999 and is aimed at AI and scientific-simulation workloads. Nvidia claims up to 110 teraflops of performance from its 21.1 billion transistors, with 12GB of HBM2 memory, 5,120 CUDA cores, and 640 "tensor cores" that are said to offer up to 9 times the deep-learning performance of its predecessor. It also comes in gold and black, which looks pretty cool.


Updated AWS Deep Learning AMIs: New Versions of TensorFlow, Apache MXNet, Keras, and PyTorch

@machinelearnbot

The AMIs also come with improved framework support for NVIDIA Volta. They include PyTorch v0.3.0, and support NVIDIA CUDA 9 and cuDNN 7, with significant performance improvements for training models on NVIDIA Volta GPUs. They also include a version of TensorFlow built from the master branch with NVIDIA Volta support merged in. We've also added Keras 2.0 support on the CUDA 9 version of the AWS Deep Learning AMIs to work with TensorFlow as the default backend.


NVIDIA Researchers Showcase Major Advances in Deep Learning at NIPS - NVIDIA Blog

@machinelearnbot

AI has become part of the public consciousness. Researchers and data scientists have been sharing their groundbreaking work -- at what is officially known as the Conference and Workshop on Neural Information Processing Systems -- for three decades. But it's only with the recent explosion of interest in deep learning that NIPS has really taken off. We had two papers accepted to the conference this year, and contributed to two others. The researchers involved are among the 120 people on the NVIDIA Research team focused on pushing the boundaries of technology in machine learning, computer vision, self-driving cars, robotics, graphics, computer architecture, programming systems, and other areas.


TITAN V: Now NVIDIA is talking deep-learning horsepower

@machinelearnbot

This is a graphics card created for the PC. VentureBeat's Blair Frank said, "The new Titan V card will provide customers with a Nvidia Volta chip that they can plug into a desktop computer." It made its debut Thursday, positioned as "the world's most powerful GPU for the PC," with CEO Jensen Huang handling the introduction. The announcement took place at the annual AI gathering, the NIPS (Neural Information Processing Systems) conference. The card packs massive amounts of compute power to speed up AI workloads.


AI News: Artificial Intelligence Trends And Leading Stocks Stock News & Stock Market Analysis - IBD

@machinelearnbot

Investors beware: there's plenty of buzz around artificial intelligence (AI) as more and more companies say they're using it. In some cases, companies are using older data-analytics tools and labeling them as AI for a public relations boost. But identifying companies actually getting material revenue growth from AI can be tricky. AI uses computer algorithms to replicate the human ability to learn and make predictions. AI software needs computing power to find patterns and make inferences from large quantities of data.


Global Automotive Artificial Intelligence Market Outlook 2017-2023 - The Big Three are Ford Motor Company, General Motors, and Fiat Chrysler Automobiles - Research and Markets

@machinelearnbot

DUBLIN--(BUSINESS WIRE)--The "Automotive Artificial Intelligence - Global Market Outlook (2017-2023)" report has been added to Research and Markets' offering. The Global Automotive Artificial Intelligence Market accounted for $563.58 million in 2016 and is expected to reach $5,265.81 million by 2023, growing at a CAGR of 37.6% during the forecast period. The automotive industry has seen the promise of artificial intelligence (AI) technology, and is among the industries at the forefront of using AI to augment human actions and to mimic the actions of humans. The arrival of features such as adaptive cruise control (ACC), blind spot alert, and advanced driver assistance systems (ADAS), together with rising demand for convenience and safety, presents an opportunity for OEMs to build novel and innovative artificial intelligence systems that would attract customers. Although 2016 was marred by some technological failures in self-driving cars, the year also saw a couple of successful test runs in the US.
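As a quick sanity check, the report's growth figures are internally consistent; the snippet below (an illustrative sketch, not taken from the report) recomputes the compound annual growth rate from the stated 2016 and 2023 market sizes.

```python
# Recompute CAGR from the report's stated figures:
# $563.58M in 2016 growing to $5,265.81M in 2023 (a 7-year span).
start, end, years = 563.58, 5265.81, 7

# CAGR = (end / start) ** (1 / years) - 1
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")  # prints 37.6%, matching the report's stated CAGR
```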


Nvidia's new AI creates disturbingly convincing fake videos

@machinelearnbot

Nvidia continues to pave a path towards the end of reality as we know it. The company recently made an AI capable of creating images of people out of thin air. Now, it's made one that can change the weather, turn day into night, and change a leopard's spots. AI researchers for the company developed an unsupervised learning method for computers which allows for sweeping changes to video content it's fed. By using the new method, they were able to produce startling results.


For HPC and Deep Learning, GPUs are here to stay - insideHPC

@machinelearnbot

In this special guest feature from Scientific Computing World, David Yip, HPC and Storage Business Development at OCF, provides his take on the place of GPU technology in HPC. There was an interesting story published earlier this week in which NVIDIA's founder and CEO, Jensen Huang, argued that, with designers barely able to work out more advanced parallel-instruction architectures for CPUs, 'GPUs will soon replace CPUs'. There are only so many processing cores you can fit on a single CPU chip. Some optimized applications take advantage of multiple cores, but CPUs are typically used for sequential, serial processing (although Intel is doing an excellent job of adding more and more cores to its CPUs and getting developers to program multicore systems). By contrast, a GPU has a massively parallel architecture consisting of many thousands of smaller, more efficient cores designed for handling multiple tasks simultaneously.
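The serial-versus-parallel distinction above can be sketched in miniature. The snippet below is a conceptual illustration only (Python threads standing in for GPU cores, which they are not): it contrasts one worker walking an array sequentially, CPU-style, with the GPU-style pattern of giving every element its own worker, all executing the identical operation.

```python
from concurrent.futures import ThreadPoolExecutor

data = list(range(8))

# Serial, CPU-style: a single worker processes elements one after another.
serial = [x * x for x in data]

# Data-parallel, GPU-style (conceptually): each element gets its own worker,
# and every worker runs the same operation on its own element -- the pattern
# a GPU applies across thousands of hardware cores at once.
with ThreadPoolExecutor(max_workers=len(data)) as pool:
    parallel = list(pool.map(lambda x: x * x, data))

print(serial == parallel)  # prints True: both approaches compute the same result
```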


At least 16 companies developing Deep Learning chips NextBigFuture.com

@machinelearnbot

There are many established and startup companies developing deep learning chips. Google and Wave Computing have working silicon and are conducting customer trials. Chinese AI chip startup Cambricon Technologies has received $100 million in funding, and aims to have one billion smart devices using its AI processor and to own 30% of China's high-performance AI chip market within three years. Huawei estimates Cambricon chips are six times faster than a GPU for deep-learning applications like training algorithms to identify images.


Vertex.AI - Announcing PlaidML: Open Source Deep Learning for Every Platform

@machinelearnbot

We're pleased to announce the next step towards deep learning for every device and platform. Today Vertex.AI is releasing PlaidML, our open source portable deep learning engine. Our mission is to make deep learning accessible to every person on every device, and we're building PlaidML to help make that a reality. We're starting by supporting the most popular hardware and software already in the hands of developers, researchers, and students. The initial version of PlaidML runs on most existing PC hardware with OpenCL-capable GPUs from NVIDIA, AMD, or Intel.