AI everywhere


"We invented a computing model called GPU accelerated computing and we introduced it slightly over 10 years ago," Huang said, noting that while AI has only recently come to dominate tech news headlines, the company was laying the foundation long before that. Nvidia's tech now resides in many of the world's most powerful supercomputers, and its applications include fields once considered beyond the reach of modern computing. Now, Nvidia's graphics hardware occupies a more pivotal role, according to Huang, and the company's long list of high-profile partners, including Microsoft, Facebook and others, bears him out. GTC, in other words, has evolved into arguably the world's biggest developer event focused on artificial intelligence.

Using TensorFlow in Windows with a GPU


In case you missed it, TensorFlow is now available for Windows, as well as Mac and Linux. This was not always the case. For most of TensorFlow's first year of existence, the only means of Windows support was virtualization, typically through Docker. Even without GPU support, this is great news for me. I teach a graduate course in deep learning and dealing with students who only run Windows was always difficult.
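A quick way to confirm that a native Windows install can actually see the GPU is to list the devices TensorFlow detects. This is a minimal sketch assuming a modern TensorFlow 2.x install (`pip install tensorflow`); the `try/except` fallback is only there so the snippet degrades gracefully on machines without TensorFlow. (In the TF 1.x era this roundup describes, the equivalent check was `device_lib.list_local_devices()` from `tensorflow.python.client`.)

```python
# List the compute devices TensorFlow can see; on a correctly configured
# Windows machine with CUDA/cuDNN installed, the GPU should appear here.
try:
    import tensorflow as tf
    gpu_devices = tf.config.list_physical_devices("GPU")
except ImportError:
    # TensorFlow is not installed in this environment.
    gpu_devices = []

print("GPUs detected:", len(gpu_devices))
```

If the count is 0 on a GPU machine, the usual culprits are a CPU-only TensorFlow package or a missing CUDA/cuDNN installation.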

What To Expect in 2017 From AMD, Intel, NVIDIA, Xilinx and Others for Machine Learning


Without a doubt, 2016 was an amazing year for Machine Learning (ML) and Artificial Intelligence (AI). I have opined on the 5 things to watch in AI for 2017 in another article; however, the potential dynamics during 2017 in the processor and accelerator semiconductors that enable this market warrant further examination. It is interesting to note that shares of NVIDIA roughly tripled in 2016, due in large part to the company's technology leadership in this space. While NVIDIA GPUs currently enjoy a dominant position in Machine Learning training, the company's latest quarterly growth of 197% YoY, in a market now worth over half a billion dollars, has inevitably attracted a crowd of potential competitors, large and small. And semiconductors remain one of the few pure AI plays for public equity investors seeking a position in this fast-growing market.

NVIDIA Launches New SHIELD TV, the Most Advanced Streamer


LAS VEGAS, NV--(Marketwired - Jan 4, 2017) - CES -- NVIDIA (NASDAQ: NVDA) today unveiled the new NVIDIA SHIELD TV -- an Android open-platform media streamer built on bleeding-edge visual computing technology that delivers unmatched experiences in streaming, gaming and AI. Sporting a sleek, new design and now shipping with both a remote and a game controller, SHIELD provides the best, most complete entertainment experience in the living room. "NVIDIA's rich heritage in visual computing and deep learning has enabled us to create this revolutionary device," said Jen-Hsun Huang, founder and chief executive officer of NVIDIA, who revealed SHIELD during his opening keynote address at CES. "SHIELD TV is the world's most advanced streamer. Its brilliant 4K HDR quality, hallmark NVIDIA gaming performance and broad access to media content will bring families hours of joy. And with SHIELD's new AI home capability, we can control and interact with content through the magic of artificial intelligence from anywhere in the house," he said.

Proof that Moore's Law has been replaced by a Virtual Moore's Law that is Accelerating and Bringing the Singularity With It


Moore's Law says that the number of transistors per square inch will double approximately every 18 months. This article will show how many technologies are providing us with a new Virtual Moore's Law, under which computer performance will at least double every 18 months for the foreseeable future thanks to many new technological developments. This Virtual Moore's Law is propelling us towards the Singularity, where the invention of artificial superintelligence will abruptly trigger runaway technological growth, resulting in unfathomable changes to human civilization. In the first of my "proof" articles two years ago, I described how it has become harder to miniaturize transistors, causing computing to go vertical instead. At that time, Samsung was mass producing 24-layer 3D NAND chips and had announced 32-layer chips. As I write this, Samsung is mass producing 48-layer 3D NAND chips, with 64-layer chips rumored to appear within a month or so.
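The doubling claim above is easy to quantify: performance after t months is 2^(t/18) times the baseline. A small sketch of that arithmetic (the function name is my own):

```python
def perf_factor(months, doubling_period_months=18):
    """Performance multiple after `months`, doubling every `doubling_period_months` months."""
    return 2 ** (months / doubling_period_months)

# 18 months -> 2x, 3 years -> 4x, a decade -> roughly 100x.
print(perf_factor(18))   # 2.0
print(perf_factor(36))   # 4.0
print(perf_factor(120))  # ~101.6
```

Compounded over a decade, an 18-month doubling cadence works out to roughly a hundredfold improvement, which is the scale of change the article's "Virtual Moore's Law" argument rests on.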

Artificial Intelligence, IBM, NVIDIA Driving Changes to Credit Cards, Health Care, Physical Security


More important to me is how this will change our lives. I spent some time last week talking to IBM about how its partnership with NVIDIA and its advancements with Watson and OpenPOWER will be changing the world around us. We spoke about a number of artificial intelligence trends, and several stood out for me. Artificial Intelligence and Credit Card Security: Every year, financial institutions write off billions in losses due to credit card fraud, and a great deal of focus has been placed on stopping this steady drip, drip, drip of illegal cost. Currently, systems are advanced enough to do four fraud checks at the time of the transaction, but they simply aren't enough to stop the flood of people cloning, stealing and skimming credit cards to steal money.

AMD chases the AI trend with its Radeon Instinct GPUs for machine learning


With the Radeon Instinct line, AMD joins Nvidia and Intel in the race to put its chips into AI applications--specifically, machine learning for everything from self-driving cars to art. The company plans to launch three products under the new brand in 2017, which include chips from all three of its GPU families. The passively cooled Radeon Instinct MI6 will be based on the company's Polaris architecture. It will offer 5.7 teraflops of performance and 224GBps of memory bandwidth, and will consume up to 150 watts of power. The small-form-factor, Fiji-based Radeon Instinct MI8 will provide 8.2 teraflops of performance and 512GBps of memory bandwidth, and will consume up to 175 watts of power.
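Taken with the board power figures, the quoted specs imply a clear efficiency step from MI6 to MI8. A quick back-of-the-envelope calculation from the numbers above (the function name is my own):

```python
def gflops_per_watt(tflops, watts):
    """Convert a peak-TFLOPS rating and board power into GFLOPS per watt."""
    return tflops * 1000.0 / watts

# Figures quoted in the article: MI6 = 5.7 TFLOPS at 150 W, MI8 = 8.2 TFLOPS at 175 W.
print(round(gflops_per_watt(5.7, 150), 1))  # MI6: 38.0
print(round(gflops_per_watt(8.2, 175), 1))  # MI8: 46.9
```

These are peak ratings, of course; sustained machine-learning throughput per watt depends heavily on the workload and software stack.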

Azure N-Series: General availability on December 1


I am really excited to announce that the general availability of the Azure N-Series will be December 1st, 2016. Azure N-Series virtual machines are powered by NVIDIA GPUs and provide customers and developers access to industry-leading accelerated computing and visualization experiences. I am also excited to announce global access to the sizes, with the N-Series available in South Central US, East US, West Europe and South East Asia, all on December 1st. We've had thousands of customers participate in the N-Series preview since we launched it back in August. We've heard positive feedback on the enhanced performance and the work we have done with NVIDIA to make this a completely turnkey experience for you.

Intel touts Nervana AI platform as key to boosting machine learning speed


Intel has set out more plans for its focus on artificial intelligence (AI) and claimed that it will reduce the time to train a deep learning model by a factor of up to 100 within the next three years. At the forefront of the firm's AI ambitions is the Intel Nervana platform, which was announced on Thursday following Intel's acquisition of deep learning startup Nervana Systems earlier this year. Setting its sights on an area currently dominated by Nvidia's GPU technology, one of the platform's main focuses will be deep learning and training neural networks. Intel claimed that its non-GPU tech will "deliver a 100-fold increase in performance that will turbocharge the pace of innovation in the emerging deep learning space". Intel will integrate Nervana's technology into its Xeon and Xeon Phi processor range.
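For scale, a 100-fold speedup over three years implies roughly a 4.6x gain per year if the improvement compounds evenly; a one-line sanity check of that arithmetic:

```python
# Annual speedup that compounds to 100x over 3 years: the cube root of 100.
annual_factor = 100 ** (1 / 3)
print(round(annual_factor, 2))  # ~4.64
```

That is an aggressive cadence compared with the roughly 2x-per-18-months pace discussed elsewhere in this roundup, which is presumably why the claim drew attention.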

Intel shares artificial intelligence strategy


Intel announced a slew of products, technologies and investments in an effort to strengthen its position in the field of artificial intelligence. In the new move, Intel has assembled a set of technology options to drive AI capabilities in everything from smart factories and drones to sports, fraud detection and autonomous cars. Intel is increasing its focus on AI because it believes its chips can power the kind of AI products recently released by companies like Facebook and Google. In a blog post, Intel CEO Brian Krzanich said, "Intel is uniquely capable of enabling and accelerating the promise of AI. Intel is committed to AI and is making major investments in technology and developer resources to advance AI for business and society."