How AI Protects PayPal's Payments and Performance | The Official NVIDIA Blog

#artificialintelligence

With advances in machine learning and the deployment of neural networks, logistic regression-powered models are expanding their uses throughout PayPal. PayPal's deep learning system is able to filter out deceptive merchants and crack down on sales of illegal products. Kutsyy explained that the machines can identify "why transactions fail, monitoring businesses more efficiently," avoiding the need to buy more hardware for problem solving. The AI Podcast is available through iTunes, DoggCatcher, Google Play Music, Overcast, PlayerFM, Podbay, Pocket Casts, PodCruncher, PodKicker, Stitcher and SoundCloud.


Moore's Law may be out of steam, but the power of artificial intelligence is accelerating

#artificialintelligence

A paper from Google's researchers says they simultaneously used as many as 800 of the powerful and expensive graphics processors that have been crucial to the recent uptick in the power of machine learning (see "10 Breakthrough Technologies 2013: Deep Learning"). Feeding data into deep learning software to train it for a particular task is much more resource intensive than running the system afterwards, but that still takes significant oomph. Intel has slowed the pace at which it introduces generations of new chips with smaller, denser transistors (see "Moore's Law Is Dead. Now What?"). That slowdown also motivates the startups--and giants such as Google--creating new chips customized to power machine learning (see "Google Reveals a Powerful New AI Chip and Supercomputer").


LG Pushes Smart Home Appliances to Another Dimension with Deep Learning Technology - Dealerscope

#artificialintelligence

To advance the functionality of today's home appliances to a whole new level, LG Electronics (LG) is set to deliver an unparalleled level of performance and convenience into the home with deep learning technology to be unveiled at CES 2017. LG deep learning will allow home appliances to better understand their users by gathering and studying customers' lifestyle patterns over time. This learning process is continuous, improving over time to provide customers with new solutions to everyday problems. Using multiple sensors and LG's deep learning technology, LG's newest robot vacuum cleaner will recognize objects around the room and react accordingly. By capturing surface images of the room, the intelligent cleaner remembers obstacles and learns to avoid them over time.


Intel's Optimized Tools and Frameworks for Machine Learning and Deep Learning

#artificialintelligence

This article introduces Intel's optimized machine learning and deep learning tools and frameworks, and describes the Intel libraries that have been integrated into them so they can take full advantage of, and run fastest on, Intel architecture. This information will be useful to first-time users, data scientists, and machine learning practitioners getting started with Intel-optimized tools and frameworks. Machine learning (ML) is a subset of the more general field of artificial intelligence (AI). ML is based on a set of algorithms that learn from data. Deep learning (DL) is a specialized ML technique based on a set of algorithms that attempt to model high-level abstractions in data by using a graph with multiple processing layers (https://en.wikipedia.org/wiki/Deep_learning).
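The "graph with multiple processing layers" idea can be sketched as a tiny forward pass. The two-layer network below is a hypothetical illustration of the concept, not code from Intel's tools:

```python
import numpy as np

def relu(x):
    # Elementwise non-linearity applied between layers
    return np.maximum(0.0, x)

def forward(x, layers):
    # Each layer is a (weights, bias) pair; stacking them forms the
    # multi-layer processing graph the definition above refers to.
    for w, b in layers:
        x = relu(x @ w + b)
    return x

rng = np.random.default_rng(0)
layers = [
    (rng.standard_normal((4, 8)), np.zeros(8)),  # layer 1: 4 inputs -> 8 units
    (rng.standard_normal((8, 2)), np.zeros(2)),  # layer 2: 8 units -> 2 outputs
]
out = forward(rng.standard_normal(4), layers)
print(out.shape)  # (2,)
```

Each layer transforms the output of the previous one, which is what lets deep networks model progressively higher-level abstractions in the data.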


How to Get Started as a Developer in AI

#artificialintelligence

The promise of artificial intelligence has captured our cultural imagination since at least the 1950s--inspiring computer scientists to create new and increasingly complex technologies, while also building excitement about the future among everyday consumers. What if we could explore the bottom of the ocean without taking any physical risks? While our understanding of AI--and what's possible--has changed over the past few decades, we have reason to believe that the age of artificial intelligence may finally be here. So, as a developer, what can you do to get started? While there are a lot of different ways to think about AI and a lot of different techniques to approach it, the key to machine intelligence is that it must be able to sense, reason, and act, then adapt based on experience.
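The sense-reason-act-adapt cycle can be made concrete with a toy thermostat agent. This is a hypothetical sketch for illustration only, not a framework from the article:

```python
class Agent:
    """Toy agent that senses a temperature, reasons against its past
    observations, acts to heat or cool, and adapts by remembering."""

    def __init__(self):
        self.experience = []

    def sense(self, environment):
        # Sense: read an observation from the environment
        return environment["temperature"]

    def reason(self, observation):
        # Reason: compare against the average of past experience
        # (default to a fixed setpoint of 20.0 before any experience)
        if self.experience:
            threshold = sum(self.experience) / len(self.experience)
        else:
            threshold = 20.0
        return "cool" if observation > threshold else "heat"

    def act(self, decision):
        # Act: translate the decision into a change on the environment
        return {"cool": -1, "heat": +1}[decision]

    def adapt(self, observation):
        # Adapt: fold the observation into accumulated experience
        self.experience.append(observation)

agent = Agent()
env = {"temperature": 25.0}
for _ in range(3):
    obs = agent.sense(env)
    env["temperature"] += agent.act(agent.reason(obs))
    agent.adapt(obs)
print(env["temperature"])
```

The loop is deliberately trivial, but the four stages map directly onto the definition of machine intelligence given above.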


Just How Smart Are Smart Machines?

#artificialintelligence

If popular culture is an accurate gauge of what's on the public's mind, it seems everyone has suddenly awakened to the threat of smart machines. Several recent films have featured robots with scary abilities to outthink and manipulate humans. In the economics literature, too, there has been a surge of concern about the potential for soaring unemployment as software becomes increasingly capable of decision making. Yet managers we talk to don't expect to see machines displacing knowledge workers anytime soon -- they expect computing technology to augment rather than replace the work of humans. In the face of a sprawling and fast-evolving set of opportunities, their challenge is figuring out what forms the augmentation should take.


Cray and Microsoft accelerate deep learning training to minutes instead of weeks

#artificialintelligence

A team of researchers from Microsoft, Cray, and the Swiss National Supercomputing Centre (CSCS) has been working on a project to speed up the use of deep learning algorithms on supercomputers. By accelerating the training process, they let data scientists obtain results within hours or even minutes instead of waiting weeks or months. With the introduction of supercomputing architectures and technologies to deep learning frameworks, customers now have the ability to solve a whole new class of problems, such as moving from image recognition to video recognition, and from simple speech recognition to natural language processing with context. The team scaled the Microsoft Cognitive Toolkit -- an open-source suite that trains deep learning algorithms -- to more than 1,000 Nvidia Tesla P100 GPU accelerators on the Swiss centre's Cray XC50 supercomputer, which is nicknamed Piz Daint.
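Scaling training across many GPUs typically relies on data parallelism: each worker computes a gradient on its own shard of the data, and the gradients are averaged before a shared weight update. The single-process sketch below simulates that averaging step for a toy linear model; it is illustrative only, and the real multi-node communication in systems like the Cognitive Toolkit is far more sophisticated:

```python
import numpy as np

def local_gradient(w, shard):
    # Each simulated worker computes the gradient of mean squared
    # error for the toy model y = w * x on its own data shard.
    x, y = shard
    return np.mean(2 * (w * x - y) * x)

def data_parallel_step(w, shards, lr=0.1):
    # Average the per-worker gradients -- the "all-reduce" step that
    # multi-GPU frameworks implement over fast interconnects.
    grads = [local_gradient(w, shard) for shard in shards]
    return w - lr * np.mean(grads)

# Synthetic data with true weight 3.0, split across 4 simulated workers.
x = np.linspace(1, 2, 40)
y = 3.0 * x
shards = [(x[i::4], y[i::4]) for i in range(4)]

w = 0.0
for _ in range(50):
    w = data_parallel_step(w, shards)
print(round(w, 2))  # converges close to 3.0
```

Because each step touches only one shard per worker, adding workers shortens the wall-clock time per pass over the data, which is the effect the Piz Daint result exploits at much larger scale.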


AMD chases the AI trend with its Radeon Instinct GPUs for machine learning

PCWorld

With the Radeon Instinct line, AMD joins Nvidia and Intel in the race to put its chips into AI applications--specifically, machine learning for everything from self-driving cars to art. The company plans to launch three products under the new brand in 2017, which include chips from all three of its GPU families. The passively cooled Radeon Instinct MI6 will be based on the company's Polaris architecture. It will offer 5.7 teraflops of performance and 224GBps of memory bandwidth, and will consume up to 150 watts of power. The small-form-factor, Fiji-based Radeon Instinct MI8 will provide 8.2 teraflops of performance and 512GBps of memory bandwidth, and will consume up to 175 watts of power.
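The quoted figures allow a quick back-of-the-envelope efficiency comparison of the two boards, using only the peak throughput and power numbers stated above:

```python
# Perf-per-watt from the figures quoted above: (peak teraflops, max watts)
cards = {
    "Radeon Instinct MI6": (5.7, 150),
    "Radeon Instinct MI8": (8.2, 175),
}

for name, (tflops, watts) in cards.items():
    # 1 teraflop = 1000 gigaflops, so tflops / watts * 1000 = GF/W
    print(f"{name}: {tflops / watts * 1000:.1f} gigaflops per watt")
```

By this crude measure the Fiji-based MI8 is the more power-efficient of the two, though peak throughput per watt ignores memory bandwidth and real workload behavior.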


The artificially intelligent eye doctor is in

#artificialintelligence

Google researchers got an eye-scanning algorithm to figure out on its own how to detect a common form of blindness, showing the potential for artificial intelligence to transform medicine remarkably soon. The algorithm can look at retinal images and detect diabetic retinopathy--which affects almost a third of diabetes patients--as well as a highly trained ophthalmologist can. It makes use of the same machine-learning technique that Google uses to label millions of Web images. Diabetic retinopathy is caused by damage to blood vessels in the eye and results in a gradual deterioration of vision. If caught early it can be treated, but a sufferer may experience no symptoms early on, making screening vital.