Results


Why the AI hype cycle won't end anytime soon

#artificialintelligence

Increasingly affordable AI maintenance and the increased speed of calculations thanks to GPUs are significant factors in the unbridled growth of AI. The astonishing results achieved by training neural networks on GPU cards made Nvidia a key player, capturing roughly 70 percent of a market share that Intel failed to gain. Compared with the results of earlier algorithms, and thanks to the combination of machine learning and big data, previously "unsolvable" problems are now being solved. Machine learning algorithms can directly analyze thousands of previous cases of different diseases and draw their own conclusions about what distinguishes a sick individual from a healthy one, and consequently help diagnose dangerous conditions, including cancer.
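As a rough sketch of the case-based learning described above, the snippet below trains a standard classifier on scikit-learn's bundled breast cancer dataset to separate "sick" from "healthy" cases (the dataset, model choice, and train/test split are illustrative assumptions, not a clinical system):

```python
# A minimal sketch: learning "sick vs. healthy" from past cases.
# Purely illustrative; not a medical diagnostic tool.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)   # 569 past cases, 30 features each
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42)

model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)                  # learn from previous cases

predictions = model.predict(X_test)          # classify unseen cases
print(f"Held-out accuracy: {accuracy_score(y_test, predictions):.2f}")
```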


Intel shares artificial intelligence strategy

#artificialintelligence

Intel announced a slew of products, technologies and investments in an effort to strengthen its position in the field of artificial intelligence. With the move, Intel has assembled a set of technology options to drive AI capabilities in everything from smart factories and drones to sports, fraud detection and autonomous cars. Intel is increasing its focus on AI because it believes it can power the kinds of AI products recently released by companies like Facebook and Google. In a blog post, Intel CEO Brian Krzanich said, "Intel is uniquely capable of enabling and accelerating the promise of AI. Intel is committed to AI and is making major investments in technology and developer resources to advance AI for business and society."


[session] Bert Loomis and AI in the Cloud By @IBMCloud @CloudExpo #AI #Cloud #DigitalTransformation

#artificialintelligence

Bert Loomis was a visionary. This general session will highlight how Bert Loomis and people like him inspire us to build great things with small inventions. In their general session at 19th Cloud Expo, Harold Hannon, Architect at IBM Bluemix, and Michael O'Neill, Strategic Business Development at Nvidia, will discuss the accelerating pace of AI development and how IBM Cloud and NVIDIA are partnering to bring AI capabilities to everyday use, on demand. They will also review two "free infrastructure" programs available to startups and innovators. Speaker bios: Harold Hannon has worked in the field of software development as both an architect and a developer for more than 15 years, with a focus on workflow, integration, and distributed systems.


Nvidia touts GPU processing as the future of big data

ZDNet

Nvidia is gearing up to tap into the big data business, which Mark Patane, Nvidia ANZ country manager, has described as a trillion-dollar business over the next few years. Speaking to ZDNet, Patane said he believes graphics processing units (GPUs) will be a key solution for helping commercial businesses analyse their big data, and pointed out that Nvidia has been working with the likes of Facebook and Google over the last two years to help them process their data. "They came to us because they realise you can't use your normal everyday computers because this data is just too much. We've been working with them for a couple of years using GPU," he said. Dr Jon Barker, Nvidia machine learning solutions architect lead, further explained that businesses are increasingly tasked with figuring out how to efficiently process all the data they are collecting, highlighting that 2.5 exabytes of digital data are produced daily, and that this figure is expected to double every three years.
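As a rough illustration of why GPUs suit this kind of workload, the sketch below offloads an elementwise computation to the GPU with the CuPy library (the array size and operation are arbitrary assumptions; requires an NVIDIA GPU and the cupy package):

```python
# A minimal sketch of GPU-accelerated number crunching with CuPy.
import numpy as np
import cupy as cp

n = 10_000_000                        # hypothetical number of records
cpu_data = np.random.rand(n).astype(np.float32)

gpu_data = cp.asarray(cpu_data)       # copy the array to GPU memory
result = cp.sqrt(gpu_data).sum()      # runs as massively parallel GPU kernels
print(float(result))                  # copy the scalar result back to the host
```

The same operations on a CPU run across at most a handful of cores; on a GPU they fan out across thousands, which is the property Patane and Barker are pointing to.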


GPU-Powered Deep Learning Emerges to Carry Big Data Torch Forward

#artificialintelligence

The AI industry has exploded thanks to deep learning algorithms running atop GPUs, says NVIDIA's Huang. So far, deep learning has primarily been the domain of major tech firms, such as Google (NASDAQ: GOOG) and Baidu (NASDAQ: BIDU), which employ the algorithms on massive GPU-powered clusters to power various services they make available over the Web, such as image recognition or speech recognition. Other large companies, like Amazon Web Services (NASDAQ: AMZN), Microsoft (NASDAQ: MSFT), and IBM (NYSE: IBM), have also made significant investments in deep learning. Underpinning many of these startups are investments in deep learning, which has emerged as the key technology driving an artificial intelligence/cognitive computing industry that analysts say will be worth upwards of half a trillion dollars in the next few years. Even so, we have yet to make full use of deep learning algorithms in the context of big data analytics, according to the authors of the Journal of Big Data article "Deep learning applications and challenges in big data analytics."
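For a concrete sense of what "deep learning algorithms running atop GPUs" look like, here is a minimal sketch of a tiny convolutional image classifier in PyTorch (layer sizes, input shape, and class count are illustrative assumptions, far smaller than anything Google or Baidu would deploy):

```python
# A minimal sketch of a convolutional classifier of the kind behind
# image recognition services; toy-sized and purely illustrative.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),  # 3-channel (RGB) input
    nn.ReLU(),
    nn.MaxPool2d(2),                             # 32x32 -> 16x16 feature maps
    nn.Flatten(),
    nn.Linear(16 * 16 * 16, 10),                 # assumes 32x32 inputs, 10 classes
)

device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device)                      # the GPU is what makes scale feasible
x = torch.randn(8, 3, 32, 32, device=device)  # a dummy batch of 8 images
print(model(x).shape)                         # torch.Size([8, 10])
```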


NVIDIA Deep Learning Tech Talk at Northwestern University

#artificialintelligence

Jon Barker is a Solution Architect with NVIDIA, helping customers and partners develop applications of GPU-accelerated machine learning and data analytics to solve defense and national security problems. He is particularly focused on applications of the rapidly developing field of deep learning. Prior to joining NVIDIA, Jon spent almost a decade as a government research scientist within the U.K. Ministry of Defence and the U.S. Department of Defense R&D communities. While in government service, he led R&D projects in sensor data fusion, big data analytics, and machine learning for multi-modal sensor data to support military situational awareness and aid decision-making.


What you missed in Big Data: The rise of deep learning

#artificialintelligence

Last week saw Nvidia Inc. add its name to the long list of vendors trying to monetize the trend by introducing a new graphics processing unit specifically designed to run deep learning algorithms. Amazon's interest in the technology was revealed to be much broader than previously believed last week after Bloomberg reported the retail giant had quietly acquired Orbeus Inc., a low-key startup that has built a deep learning service for performing automated object recognition. The acquisition came against the backdrop of another analytics outfit, SurveyMonkey Inc., announcing some exciting news of its own: the launch of its app monitoring service.


Michele Goetz' Blog

#artificialintelligence

Each emotion illustrates what everyone will soon experience on NVIDIA's next-gen compute platform, with announcements for AI, VR, self-driving, SDKs and a new deep learning appliance. This is not your traditional, or even big data, analytics platform. Stepping back from what may seem like hype and from examples steeped in robotics, VR and infrastructure, the truth is that today's announcements show deep learning in action is at most a year away, and possibly here already. Keep these five strategy shifts in mind as you introduce AI compute platforms into your organization.


NVIDIA, Massachusetts General Hospital Use Artificial Intelligence to Advance Radiology, Pathology, Genomics

#artificialintelligence

SAN JOSE, CA--(Marketwired - Apr 5, 2016) - GPU Technology Conference -- NVIDIA (NASDAQ: NVDA) today announced that it is a founding technology partner of the MGH Clinical Data Science Center, which aims to advance healthcare by applying the latest artificial intelligence techniques to improve the detection, diagnosis, treatment and management of diseases. Massachusetts General Hospital -- which conducts the largest hospital-based research program in the United States, and is the top-ranked hospital on this year's U.S. News & World Report "Best Hospitals" list -- recently established the MGH Clinical Data Science Center in Boston. To process the massive amounts of clinical data involved, the center will deploy the NVIDIA DGX-1 -- a server designed for AI applications, launched earlier today at the GPU Technology Conference -- and deep learning algorithms created by NVIDIA engineers and Mass General data scientists. Initially, the MGH Clinical Data Science Center will focus on the fields of radiology and pathology -- which are particularly rich in images and data -- and then expand into genomics and electronic health records.