
Using GPU to Accelerate Machine Learning

#artificialintelligence

Over the past decade, graphics processing units (GPUs) have claimed their spot as the leading processing powerhouse of the artificial intelligence world, especially in machine intelligence and deep learning applications. How did GPUs solidify their place in the computing industry, and how did graphics cards come to solve problems that have nothing to do with graphics?
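One hedged way to see why this workload maps so well onto GPUs: the matrix multiplies at the heart of neural networks are embarrassingly parallel, since every output element depends only on one row and one column of the inputs. A minimal pure-Python sketch (illustrative only, not any vendor's implementation):

```python
# Each out[i][j] depends only on row i of a and column j of b, so all
# rows*cols dot products are independent -- exactly the kind of work a
# GPU's thousands of cores can execute at the same time.
def matmul(a, b):
    rows, inner, cols = len(a), len(b), len(b[0])
    return [[sum(a[i][k] * b[k][j] for k in range(inner)) for j in range(cols)]
            for i in range(rows)]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(matmul(A, B))  # [[19, 22], [43, 50]]
```

In practice frameworks dispatch this same computation to highly tuned GPU kernels; the structure of the problem, not the code above, is what makes the hardware a good fit.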


Improving Image Recognition to Accelerate Machine Learning

#artificialintelligence

Deep learning is a fascinating subfield of machine learning that creates artificially intelligent systems inspired by the structure and function of the brain. The basis of these models is bio-inspired artificial neural networks that mimic the neural connectivity of animal brains to carry out cognitive functions such as problem solving. One of the fields where neuromorphic computing has produced its most impressive results is visual image analysis. Similar to how our brains learn to recognize objects in order to make predictions and act upon them, an artificial intelligence must be shown millions of pictures before it can generalize and make its best educated guesses about images it has never seen before. Professor Cheol Seong Hwang from the Department of Material Science and Engineering at Seoul National University and his research team have developed a method to accelerate the image recognition process by combining the inherent efficiency of resistive random access memory (ReRAM) and cross-bar array structures, two of the most commonly used hardware building blocks. Many of us have performed a reverse image search to find information based on a certain image in order to browse similar results.
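The appeal of a ReRAM cross-bar array is that it performs a neural network's core multiply-accumulate step in a single analog operation: each cell's conductance encodes a weight, and applying voltages to the columns yields, by Ohm's and Kirchhoff's laws, a summed current on each row. A simple digital simulation of that behavior (an illustrative sketch, not the research team's actual method) looks like this:

```python
# Simulate the analog multiply-accumulate of a ReRAM cross-bar array.
# conductances[i][j] plays the role of cell (i, j)'s conductance (a weight);
# voltages[j] is the voltage applied to column j. The current collected on
# row i is the dot product sum_j conductances[i][j] * voltages[j].
def crossbar_mvm(conductances, voltages):
    return [sum(g * v for g, v in zip(row, voltages)) for row in conductances]

G = [[0.5, 1.0, 0.0],    # a 2x3 array of cell conductances (weights)
     [0.25, 0.5, 1.0]]
V = [2.0, 1.0, 4.0]      # input voltages on the three columns
I = crossbar_mvm(G, V)   # row currents: [2.0, 5.0]
```

The hardware produces all row currents simultaneously, which is why in-memory cross-bar computation can accelerate the matrix-vector products that dominate image-recognition workloads.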



Spell - Join the Movement to Accelerate Machine Learning

#artificialintelligence

Community authors will receive $250 cloud GPU credits on the Spell platform in exchange for creating content that will live on the Spell Community site. This contribution from community authors helps grow Spell's library of data science, machine learning, and deep learning tutorials and educate others on the world of AI.


NVIDIA and VMware to Accelerate Machine Learning, Data Science and AI Workloads

#artificialintelligence

NVIDIA and VMware today announced their intent to deliver accelerated GPU services for VMware Cloud on AWS to power modern enterprise applications, including AI, machine learning and data analytics workflows. These services will enable customers to seamlessly migrate VMware vSphere-based applications and containers to the cloud, unchanged, where they can be modernized to take advantage of high-performance computing, machine learning, data analytics and video processing applications. Increasingly, businesses are applying artificial intelligence (AI) technologies to differentiate and advance their processes and offerings. Enterprises are rapidly adopting AI and implementing new AI strategies that require powerful computers to create predictive models from petabytes of corporate data. Across industries, enterprises are implementing machine learning applications such as image and voice recognition, advanced financial modeling and natural language processing using neural networks that rely on NVIDIA GPUs for faster training and real-time inference.


NVIDIA and VMware to Accelerate Machine Learning, Data Science and AI Workloads - DATAVERSITY

#artificialintelligence

According to a new press release, "NVIDIA and VMware today announced their intent to deliver accelerated GPU services for VMware Cloud on AWS to power modern enterprise applications, including AI, machine learning and data analytics workflows. These services will enable customers to seamlessly migrate VMware vSphere-based applications and containers to the cloud, unchanged, where they can be modernized to take advantage of high-performance computing, machine learning, data analytics and video processing applications." The release goes on, "Increasingly, businesses are applying artificial intelligence (AI) technologies to differentiate and advance their processes and offerings. Enterprises are rapidly adopting AI and implementing new AI strategies that require powerful computers to create predictive models from petabytes of corporate data. Across industries, enterprises are implementing machine learning applications such as image and voice recognition, advanced financial modeling and natural language processing using neural networks that rely on NVIDIA GPUs for faster training and real-time inference. Additionally, VMware recently acquired Bitfusion, which enables VMware to efficiently make GPU capabilities available for AI and machine learning workloads in the enterprise."



AI-powered robot mimics ANY action after watching it done just once

Daily Mail - Science & tech

A new breed of AI-powered robots could soon mimic almost any action after watching a human do it just once. Scientists have developed a clawed machine that can learn new tasks, such as dropping a ball into a bowl or picking up a cup, simply by viewing a person perform them first. Researchers said the trick allows the robot to master new skills much faster than other robots, and could one day lead to machines capable of learning complex tasks purely through observation – much like humans and animals do. Project lead scientist Tianhe Yu wrote in a blog post: 'Learning a new skill by observing another individual, the ability to imitate, is a key part of intelligence in humans and animals.'


DLT Partners with DataRobot to Accelerate Machine Learning and Data Science Initiatives in Public Sector

@machinelearnbot

DataRobot enables government users to build and deploy highly accurate machine learning models in a fraction of the time. "Identifying and hiring the top data scientists is hard for any organization but none more so than the federal government," said Erin Hawley, VP of Public Sector for DataRobot. "Our platform makes it easier than ever for an organization to adopt data science into their environment, at a fraction of the cost and time. We're excited to partner with DLT as it is their vision to bring together a community of data science partners that will be a key advantage to federal government agencies." Recently launched, the DLT AnalyticsStack™ is a public sector focused Big Data, Analytics and Data Science 'solution stack' providing agencies with a scalable reference framework that addresses rapidly evolving big data requirements and use cases.