Results


Amazon's new GPU-cloud wants to chew through your AI and big data projects - ZDNet

#artificialintelligence

Amazon Web Services (AWS) has unveiled a new GPU-powered cloud computing service for artificial intelligence, seismic analysis, molecular modeling, genomics, and other applications that need vast amounts of parallel processing power. AWS said its P2 instances for Amazon Elastic Compute Cloud (Amazon EC2) are aimed at applications that require "massive parallel floating point performance." "These instances were designed to chew through tough, large-scale machine learning, deep learning, computational fluid dynamics, seismic analysis, molecular modeling, genomics, and computational finance workloads," said Jeff Barr, chief evangelist at AWS. While GPUs were first associated with gaming, they're now finding a new life handling huge computing workloads, as they can be scaled out so that banks of GPUs tackle tasks in parallel. This contrasts with the traditional approach of scaling up, where increasingly complex problems were tackled on individual machines with ever faster CPUs, an approach that is becoming increasingly hard to sustain.
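For readers who want to try the new instance type, a minimal sketch using the boto3 Python SDK might look like the following. This is an illustration rather than anything from the article: the AMI ID and key pair name are placeholders, and the region is an assumption.

```python
# Minimal sketch: launching a GPU-backed P2 instance with boto3 (the Python AWS SDK).
# The AMI ID below is a placeholder; substitute an image available in your region,
# and make sure your account has a p2.* instance limit greater than zero.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # assumed region

response = ec2.run_instances(
    ImageId="ami-xxxxxxxx",       # placeholder AMI
    InstanceType="p2.xlarge",     # smallest P2 size, one NVIDIA K80 GPU
    MinCount=1,
    MaxCount=1,
    KeyName="my-key-pair",        # assumed existing key pair
)

instance_id = response["Instances"][0]["InstanceId"]
print("Launched", instance_id)
```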


Artificial intelligence: mind games » Banking Technology

#artificialintelligence

Artificial intelligence (AI) isn't new, but the rise of mobile and cloud computing, combined with big data and cheap computing power, is driving a resurgence. Convergent technologies mean AI is finding new uses in financial services. AI will be used in "every single segment of financial services", predicts Christophe Chazot, group head of innovation, HSBC. "The software is getting more intelligent in a human sense, mimicking human reasoning." The technology can help wealth advisors, back office staff and operations, traders and corporate finance teams.


Spark for Scale: Machine Learning for Big Data

#artificialintelligence

Recently we shared an introduction to machine learning. While making machines learn from data is fun, data from real-world scenarios often gets out of hand if you try to apply traditional machine-learning techniques on your own computer. To actually use machine learning with big data, it's crucial to learn how to deal with data that is too big to store or compute on a single machine. Today we will discuss fundamental concepts for working with big data using distributed computing, then introduce the tools you need to build machine learning models. We'll start with some naive approaches, which are meant only as examples.
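As a taste of what that tooling looks like in practice, here is a minimal sketch (not from the article) of training a model with Spark's MLlib. The tiny in-memory dataset and column names are made up purely for illustration; real workloads would read from distributed storage instead.

```python
# Minimal PySpark sketch: training a logistic regression model on a Spark DataFrame.
# Dataset, feature values, and column names are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.ml.classification import LogisticRegression
from pyspark.ml.linalg import Vectors

spark = SparkSession.builder.appName("ml-at-scale-sketch").getOrCreate()

# In a real big data setting this would come from HDFS/S3, e.g. spark.read.parquet(...)
train = spark.createDataFrame(
    [
        (0.0, Vectors.dense(0.1, 1.2)),
        (1.0, Vectors.dense(2.3, 0.4)),
        (1.0, Vectors.dense(1.9, 0.6)),
        (0.0, Vectors.dense(0.2, 1.0)),
    ],
    ["label", "features"],
)

lr = LogisticRegression(maxIter=10, regParam=0.01)
model = lr.fit(train)          # training runs across the cluster's executors
print(model.coefficients)

spark.stop()
```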


Artificial Intelligence vs. Deep Learning vs. Big Data - Nanalyze

#artificialintelligence

Computing was some pretty exciting stuff for those of us back in the 80s who still remember the first time we booted up our 386DX. While nobody could really say what the advantages of the "DX" were, better at math or something, we still ponied up the extra 200 USD to pick up that 386DX 16MHz along with a Super VGA graphics card, then hooked that bad boy up to CompuServe via our lightning fast 14,400 baud U.S. Robotics "Sportster" modem. That was well before Al Gore created the Internet, and a lot has changed since then. So have we, so let's go through and define some of these terms and what they mean for investors. "The Cloud" – The idea here is that instead of purchasing applications then installing them onto a computer, you lease the applications on demand and access them over the internet.


New processors in the cloud accelerate AI and big data

#artificialintelligence

One of the ironies of cloud computing is that the more its cookie-cutter architecture allows it to scale, the more it becomes possible to reintroduce diversity into that architecture. While cloud computing's early successes came from working out how to automatically manage serried ranks of identical general-purpose commodity computers, the vast scale that providers have now reached makes it economically and operationally viable to introduce pools of specialized computing into that infrastructure. Nowhere is this more pertinent than in the analysis of huge volumes of data, in applications such as big data and, increasingly, machine learning. In search of high performance in these applications, the leading cloud providers are investing in increasingly esoteric processor technologies, even to the extent of custom-building their own designs, as both Google and IBM are known to have done, while Microsoft and Amazon are thought to be doing the same -- perhaps also Apple and Facebook. Google's Tensor Processing Unit (TPU) is a case in point.


AI and the IoT: Are We Truly Prepared for What's Coming?

#artificialintelligence

The enterprise, as always, is at the forefront of virtually all of the technology revolutions taking place today. From Big Data and the Internet of Things to virtual infrastructure and digital business processes, IT is driving the transformation from old-style systems and infrastructure to highly available, highly intelligent applications and services. But sometimes it helps to stop for a moment and see where all this is headed and what work, and life, would be like if all of these developments came to fruition. To my mind, the most consequential advancements are coming in the areas of the IoT and artificial intelligence. How, exactly, will the world function once it has access to a global, interconnected computing environment that touches every device on the planet?


The Machine Learning Advantage

#artificialintelligence

Machine learning is, to keep it simple, an algorithm developed to note changes in data and evolve its design to accommodate the new findings. As applied to predictive analytics, this has a wide-ranging impact on the activities normally undertaken to develop, test, and refine an algorithm for a given purpose. Sophisticated pattern recognition – Along with noting relationships, the Yottamine Predictive Platform can determine their type and quantify them as well. This happens not just with key or even secondary variables, but with every relationship that takes part in the pattern. The platform also delineates irrelevant data, which reduces pre-processing requirements and accelerates processing.
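Yottamine's platform is proprietary, so the sketch below is only a generic stand-in for the two ideas described above: updating a model as new data arrives, and flagging which inputs are irrelevant. It uses scikit-learn with synthetic data, and every name and value in it is an assumption made for illustration.

```python
# Generic illustration (not Yottamine's API): incremental updates plus a relevance check.
# Data is synthetic; feature names and sizes are made up for the example.
import numpy as np
from sklearn.linear_model import SGDClassifier
from sklearn.feature_selection import mutual_info_classif

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # only the first two features matter

# "Evolving with new findings": update the model one mini-batch at a time.
model = SGDClassifier()
for batch in np.array_split(np.arange(len(X)), 10):
    model.partial_fit(X[batch], y[batch], classes=[0, 1])

# "Delineating irrelevant data": score each feature's relationship to the target.
scores = mutual_info_classif(X, y, random_state=0)
print({f"feature_{i}": round(s, 3) for i, s in enumerate(scores)})
```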


Nvidia unifies a big list of developer tools in one package

PCWorld

Nvidia's varied range of GPU developer tools has been spread out across specialized kits, but that's no longer the case. The company has announced the Nvidia SDK unified toolkit, which brings together its game development, supercomputing, virtual reality, automotive, and drone and robot development tools in one package. The toolkit combines the essential tools and libraries necessary for GPU development, Jen-Hsun Huang, CEO of Nvidia, said during a keynote at the company's GPU Technology Conference in San Jose, California. The toolkit is tuned for Nvidia's latest Pascal GPU architecture, which the company is expected to detail at the show. Pascal contains many technological improvements that could trigger changes in the way applications are written for GPUs.


Google edges into cloud analytics, big data, machine learning alongside Amazon, IBM, Microsoft

#artificialintelligence

Google jumped into the emerging space for analytics and big data when it revealed the new Cloud Machine Learning suite of services. "There's a new architecture emerging," Eric Schmidt, executive chairman of Google parent Alphabet, said at Google's GCP Next last week. "In a year, you will use machine learning to do something better than humans have been doing." Schmidt is not alone in that thinking. Google rivals Amazon, IBM, and Microsoft, in fact, have made similar cloud computing moves of late.


When big data gets too big, this machine-learning algorithm may be the answer

#artificialintelligence

Big data may hold a world of untapped potential, but what happens when your data set is bigger than your processing power can handle? A new algorithm that taps quantum computing may be able to help. That's according to researchers from MIT, the University of Waterloo and the University of Southern California, who published a paper Monday describing a new approach to handling massively complex problems. By combining quantum computing and topology -- a branch of geometry -- the new machine-learning algorithm can streamline highly complex problems and put solutions within closer reach. Topology focuses on properties that stay the same even when something is bent and stretched, and it's particularly useful for analyzing the connections in complex networks, such as the U.S. power grid or the global interconnections of the Internet.
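The paper itself relies on quantum algorithms, but the underlying topological idea can be illustrated classically: invariants such as the number of connected components in a network do not change however the network is bent, stretched, or redrawn. The toy sketch below uses networkx as an assumed, simplified stand-in for that idea; the nodes and edges are made up.

```python
# Toy illustration (not the paper's quantum algorithm): connected components are a
# topological property of a network -- they stay the same however the graph is drawn.
# The edges below are made-up placeholders for something like a power grid.
import networkx as nx

G = nx.Graph()
G.add_edges_from([
    ("plant_A", "substation_1"),
    ("substation_1", "city_X"),
    ("plant_B", "substation_2"),   # a second, disconnected cluster
    ("substation_2", "city_Y"),
])

components = list(nx.connected_components(G))
print(len(components), "connected components:", components)
# Prints 2 components -- no amount of bending or stretching changes that count.
```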