Nvidia


Reports Say Fujitsu, Huawei Developing Artificial Intelligence Chips

#artificialintelligence

System makers Fujitsu and Huawei Technologies are both reportedly planning to develop processors optimized for artificial intelligence workloads, moves that will put them into competition with the likes of Intel, Google, Nvidia and Advanced Micro Devices. Tech vendors are pushing hard to bring artificial intelligence (AI) and deep learning capabilities into their portfolios to meet the growing demand generated by a broad range of workloads, from data analytics to self-driving vehicles. Fujitsu engineers have been working for the past couple of years on what the company calls a deep learning unit (DLU), and last month gave more details on the component at the International Supercomputing Conference. The chip reportedly will include 16 deep learning processing elements, each housing eight single-instruction, multiple-data (SIMD) execution units.
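
For readers unfamiliar with the terminology, the sketch below models that reported layout in NumPy: a SIMD execution unit applies one instruction across a whole vector of data at once, and the 16-by-8 arrangement multiplies that parallelism. The lane width and the multiply-add operation are illustrative assumptions, not published specifications of the DLU.

    # Illustrative sketch only: models the reported DLU layout (16 deep learning
    # processing elements, each with 8 SIMD execution units) in NumPy.
    import numpy as np

    N_DPE = 16          # deep learning processing elements per chip (reported)
    SIMD_PER_DPE = 8    # SIMD execution units per DPE (reported)
    LANE_WIDTH = 16     # data elements handled per SIMD instruction (assumed)

    # One SIMD unit: a single instruction (here, a multiply-add) applied to a
    # whole vector of data at once instead of one scalar at a time.
    def simd_fma(a, b, c):
        return a * b + c   # NumPy applies the operation across all lanes in one call

    a = np.random.rand(LANE_WIDTH).astype(np.float32)
    b = np.random.rand(LANE_WIDTH).astype(np.float32)
    c = np.zeros(LANE_WIDTH, dtype=np.float32)
    print(simd_fma(a, b, c))

    # Peak parallelism per instruction if every unit is kept busy (illustrative):
    print(N_DPE * SIMD_PER_DPE * LANE_WIDTH, "parallel multiply-adds")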


Artificial Intelligence - what problems can it solve?

#artificialintelligence

The event was part of a series that Odgers Berndtson is running in partnership with technology company NVIDIA, under NVIDIA's Inception Program. Odgers Berndtson's partnership on the events – known as 'Inception Connect' – aims to expose this innovation to new audiences. Speaking about the event series, Odgers Berndtson's Head of the Technology Practice, Michael Drew, said: "We regularly speak to business leaders who wish to remain ahead of the curve by discovering new AI solutions." The Odgers Berndtson Technology Practice is currently planning the next events, focusing on other verticals, scheduled for later in 2017.


Nvidia and the GPU: contribution to the AI world of self-driving cars

#artificialintelligence

In other words, the GPU delivers better prediction accuracy, faster results, a smaller footprint, lower power and lower costs. What is fascinating about Nvidia is that it has a full-stack solution architecture for deep learning (DL) applications, making it easier and faster for data scientists and engineers to deploy their programs. As part of a complete software stack for autonomous driving, NVIDIA created a neural-network-based system, known as PilotNet, which outputs steering angles given images of the road ahead. In addition to learning obvious features such as lane markings, edges of roads, and other cars, PilotNet learns more subtle features that would be hard for engineers to anticipate and program, for example bushes lining the edge of the road and atypical vehicle classes (Source: Cornell University CS department).
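
NVIDIA's papers describe the exact PilotNet model; the PyTorch sketch below only illustrates the general idea of a convolutional network that maps a road image to a single steering-angle output. The layer sizes and pooling step are chosen for illustration and are not the published architecture.

    # Minimal sketch of a PilotNet-style network: a CNN that takes an image of the
    # road ahead and outputs one steering angle. Layer sizes are assumptions.
    import torch
    import torch.nn as nn

    class SteeringNet(nn.Module):
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(3, 24, kernel_size=5, stride=2), nn.ReLU(),
                nn.Conv2d(24, 36, kernel_size=5, stride=2), nn.ReLU(),
                nn.Conv2d(36, 48, kernel_size=5, stride=2), nn.ReLU(),
                nn.Conv2d(48, 64, kernel_size=3), nn.ReLU(),
                nn.AdaptiveAvgPool2d((1, 18)),   # fix spatial size regardless of input
            )
            self.regressor = nn.Sequential(
                nn.Flatten(),
                nn.Linear(64 * 18, 100), nn.ReLU(),
                nn.Linear(100, 50), nn.ReLU(),
                nn.Linear(50, 1),                # predicted steering angle
            )

        def forward(self, x):
            return self.regressor(self.features(x))

    # One forward pass on a dummy 66x200 RGB road image (batch of 1).
    model = SteeringNet()
    angle = model(torch.randn(1, 3, 66, 200))
    print(angle.shape)  # torch.Size([1, 1])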


The hidden horse power driving Machine Learning models

#artificialintelligence

This will typically learn fairly good movie recommendations in about 100 epochs. Companies are starting to offer hardware that can be situated close to where the data is produced (in terms of network speed) for machine learning. To get an idea of its speed, a researcher loaded up the ImageNet 2012 dataset and trained a ResNet-50 machine learning model on it.
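
The article does not say which recommender it means, but matrix factorization trained by stochastic gradient descent is a common choice for this kind of movie-recommendation task. The NumPy sketch below runs such a model for 100 epochs on made-up ratings purely to illustrate the idea; the data and hyperparameters are not from the article.

    # Hypothetical sketch: a tiny matrix-factorization recommender trained with
    # SGD for 100 epochs. Ratings and hyperparameters are made-up illustrations.
    import numpy as np

    rng = np.random.default_rng(0)
    n_users, n_movies, n_factors = 50, 40, 8
    # Sparse synthetic ratings: 0 means "not rated", 1-5 are star ratings.
    ratings = rng.integers(1, 6, size=(n_users, n_movies)) * (rng.random((n_users, n_movies)) < 0.2)

    U = rng.normal(scale=0.1, size=(n_users, n_factors))   # user factors
    V = rng.normal(scale=0.1, size=(n_movies, n_factors))  # movie factors
    lr, reg = 0.01, 0.05
    observed = np.argwhere(ratings > 0)

    for epoch in range(100):                      # "fairly good in ~100 epochs"
        rng.shuffle(observed)
        for u, m in observed:
            err = ratings[u, m] - U[u] @ V[m]     # prediction error for one rating
            u_old = U[u].copy()
            U[u] += lr * (err * V[m] - reg * U[u])
            V[m] += lr * (err * u_old - reg * V[m])

    # Recommend: highest predicted scores among movies the user has not rated.
    user = 0
    scores = U[user] @ V.T
    scores[ratings[user] > 0] = -np.inf
    print("Top picks for user 0:", np.argsort(scores)[::-1][:5])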


Nvidia CEO: "Software is eating the world, but AI is going to eat software"

#artificialintelligence

Nvidia has benefited from a rapid explosion of investment in machine learning from tech companies. Can this rapid growth in the use cases for machine learning continue? Recent research results from applying machine learning to medical diagnosis are impressive (see "An AI Ophthalmologist Shows How Machine Learning May Transform Medicine"). Nvidia's chips are already driving some cars: all Tesla vehicles now use Nvidia's Drive PX 2 computer to power the Autopilot feature that automates highway driving.


Artificial Emotional Intelligence For Web Browsing -- NVIDIA Blog

#artificialintelligence

The first public pilot of UK-based Emotions.Tech's artificial emotional intelligence, launched in May, allows users to search according to how they want the results to make them feel. "We need that acceleration to keep up with the complexities of human emotion," Tero says. To do that, Emotions.Tech turned to GPU-powered deep learning to rank, list and search web pages according to their emotional content. They then use this data to train artificial neural networks.
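
Emotions.Tech has not published its models, so the sketch below only illustrates the general idea of emotion-aware ranking: score each page for the feeling the user asked for, then sort the results by that score. The tiny word-list scorer is a stand-in for the GPU-trained neural networks described above, and the pages and lexicon are invented for the example.

    # Hypothetical sketch of emotion-aware ranking. The lexicon scorer below is a
    # stand-in for a trained neural network that rates a page's emotional content.
    EMOTION_LEXICON = {
        "happy": {"joy", "delight", "fun", "celebrate", "win"},
        "calm":  {"quiet", "gentle", "peaceful", "soothing", "rest"},
    }

    def emotion_score(text, emotion):
        """Fraction of words associated with the requested emotion."""
        words = text.lower().split()
        if not words:
            return 0.0
        hits = sum(1 for w in words if w in EMOTION_LEXICON[emotion])
        return hits / len(words)

    def rank_by_emotion(pages, emotion):
        """Return page URLs ordered by how strongly they match the emotion."""
        return sorted(pages, key=lambda url: emotion_score(pages[url], emotion), reverse=True)

    pages = {
        "example.com/a": "A quiet peaceful walk with gentle rain and rest",
        "example.com/b": "Celebrate the big win with fun and joy and delight",
    }
    print(rank_by_emotion(pages, "happy"))  # page b ranks first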


What is artificial intelligence and why it'll be a $45 billion business in 2020

#artificialintelligence

This training process takes a huge amount of computing power to fine-tune. The huge amount of processing power required to run and train AI systems is what has kept AI research relatively quiet until recently, which leads to the next question. AI systems require lots of real-world examples to be trained well, such as lots of cat photos. Tesla, Google, Apple and many of the traditional car companies are training AI systems for autonomous driving.
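
To make that point concrete, the illustrative sketch below trains a toy "cat vs. not-cat" classifier on synthetic feature vectors; accuracy generally improves as the number of labeled examples grows, which is why large datasets and the compute to churn through them matter. The data, features and model here are all assumptions for demonstration, not anything from the article.

    # Illustrative only: a toy classifier trained on synthetic features, showing
    # that more labeled examples generally yield a better-trained model.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)

    def make_data(n):
        # Two fuzzy clusters standing in for image features of cats vs. everything else.
        cats = rng.normal(loc=1.0, scale=3.0, size=(n, 10))
        rest = rng.normal(loc=-1.0, scale=3.0, size=(n, 10))
        X = np.vstack([cats, rest])
        y = np.array([1] * n + [0] * n)
        return X, y

    X_test, y_test = make_data(2000)
    for n_train in (10, 100, 10_000):             # more labeled examples per class...
        X, y = make_data(n_train)
        clf = LogisticRegression(max_iter=1000).fit(X, y)
        # ...typically yields higher held-out accuracy.
        print(n_train, "examples per class -> accuracy", round(clf.score(X_test, y_test), 3))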


Learning Machine Learning

#artificialintelligence

Massive Open Online Courses (MOOCs) are a good starting point, with a lot to offer. The article entitled "Top Machine Learning MOOCs and Online Lectures: A Comprehensive Survey" lists a number of good resources. For example, the MXNet website lists a number of dataset sources for CNNs and RNNs. Intel's Python-based Neon framework, from Nervana (now an Intel company), is another option, alongside platforms such as Apache Spark, TensorFlow, Caffe, and Theano.


3 Ways AI Can Boost NVIDIA -- The Motley Fool

#artificialintelligence

The graphics specialist has been applying its graphics processing units (GPUs) to train AI models, setting itself up to tap an AI chip market that could be worth $16 billion in 2022, according to Markets and Markets. The company launched its first-generation DRIVE PX platform two years ago, hoping to partner with automakers and develop self-driving cars. All of these partnerships have pushed NVIDIA's automotive revenue from just $56 million at the end of fiscal year 2015 to $140 million in the first quarter of fiscal 2018. NVIDIA saw this trend early and launched its Tesla GPU accelerators around five years ago for supercomputing applications.


NVIDIA Invests in Cyber Security Startup Deep Instinct -- The Official NVIDIA Blog

#artificialintelligence

In the latest of a series of investments in deep learning startups, NVIDIA is investing in Deep Instinct, an Israel-based startup that uses deep learning to thwart cyberattacks. Deep Instinct was recently named a "Technology Pioneer" by the World Economic Forum, and the "Most Disruptive Startup" at NVIDIA's 2017 Inception Awards. NVIDIA recently expanded its portfolio of startup investments, adding eight companies in four countries over the past year. NVIDIA's investment portfolio is only one element of the company's extensive involvement in the AI ecosystem.